How to Create a Test Strategy


How to Create a Test Strategy
Intelligent Assurance and Testing
Paul Gerrard
paul@gerrardconsulting.com | Twitter: @paul_gerrard | Web: gerrardconsulting.com

Paul Gerrard
Paul Gerrard is a consultant, teacher, author, webmaster, programmer, tester, conference speaker, rowing coach and a publisher. He has conducted consulting assignments in all aspects of software testing and quality assurance, specialising in test assurance. He has presented keynote talks and tutorials at testing conferences across Europe, the USA, Australia and South Africa, and has occasionally won awards for them. Educated at the universities of Oxford and Imperial College London, Paul won the EuroSTAR European Testing Excellence Award in 2010. In 2012 he co-authored "The Business Story Pocketbook" with Susan Windsor. He is Principal of Gerrard Consulting Limited and is the host of the UK Test Management Forum and the UK Business Analysis Forum.

Agenda
- Your test strategy challenges?
- What is test strategy?
- Test strategy approach
- Test axioms as thinking tools
- Using the First Equation of Testing
- Testing in staged projects
- Goals, risks and designing the test process
- Goals, risks and coverage-based test reporting
- Communicating test strategies
- Case study exercise
- Your challenges revisited (if we have time).
NB some slides are hidden and won't be presented today.

Overview
This is a full-day course (or two days, customised for clients). I won't talk over every slide, and we might not get through the whole case study exercise in detail. But you WILL get some practice through the Q&A and conversations about test strategy.

What Are Your Test Strategy Challenges?

What Are Your Test Strategy Challenges? Ever written a test strategy that no one read?

What is Test Strategy? Dig a Hole? Test a system?

Test strategy answers questions
Before you can plan a test, you need to have a lot of questions answered. The strategy:
- Presents some decisions that can be made ahead of time
- Defines the process, method or information that will allow decisions to be made (in project)
- Sets out the principles (or process) to follow for uncertain situations or unplanned events.

Test Strategy 'evolution'
- Success-based: test to show it works
- Defect-based: test to find bugs
- Coverage-based: analyse requirements and code to achieve test coverage targets
- Risk-based: use risk to focus testing and to inform risk assessment
- Goal-based: use business goals and risk to focus testing and support decision-making.
FOCUS: from programmer -> tester -> stakeholder
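The coverage-based style can be made concrete with a small sketch. This is a hypothetical illustration (the requirement and test IDs are invented): map each test to the requirements it exercises, then report a coverage figure against the requirements set.

```python
# Hypothetical coverage-based reporting sketch: map each (made-up) test
# to the requirements it exercises, then report a coverage figure.
requirements = {"R1", "R2", "R3", "R4", "R5"}

tests = {
    "T1": {"R1", "R2"},
    "T2": {"R2", "R3"},
    "T3": {"R5"},
}

# Union of everything any test touches, restricted to known requirements
covered = set().union(*tests.values()) & requirements
uncovered = requirements - covered
coverage = len(covered) / len(requirements)

print(f"Requirements coverage: {coverage:.0%}")  # 80%
print(f"Not yet covered: {sorted(uncovered)}")   # ['R4']
```

A report like this is the raw material for coverage-based test reporting; the same shape works for code or design coverage targets.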

Goal-Based Testing
- Multiple objectives in multiple test stages
- Objectives vary with product readiness (testing in the small -> testing in the large)
- Objectives vary by role: developer -> tester -> user -> sponsor
- Multiple stakeholders: sponsor, users, developers, support staff, project management
- Different test models for each test objective
- Projects are unique; goals and risks are unique, and therefore the test approach is unique too.

Test Strategy defines:
- Who the testing stakeholders are
- What their concerns are and how those concerns will be used to scope testing
- What, when and how testing will report to stakeholders
- Who will perform the testing and reporting
- The human and technical resources required to deliver
- What might change, what could go wrong and how testers can recover the situation.

Who are the stakeholders?
Those who focus on risk and business goals:
- Sponsors, project stakeholders
- Business users
- Project management
Those who focus on contractual aspects, stage payments etc.:
- Software suppliers
- Contracts people.

Who are the stakeholders? (2)
Those who focus on their risks and responsibilities:
- Suppliers
- Developers
- System Test Team
- UAT Team
Those who focus on meeting business and technical requirements:
- Technical Architects
- Operations
- Technical Support.

Test Strategy and Approach Strategy is a thought process not a document

Test Strategy ToC from 1994
1. Introduction: 1.1 Version History; 1.2 Purpose; 1.3 Scope; 1.4 Background; 1.5 Assumptions and Dependencies; 1.6 Summary
2. Overview of User Testing: 2.1 Objectives; 2.2 User Testing and the Overall Project; 2.3 Functional Requirements Testing; 2.4 The Need for Technical Requirements Testing; 2.5 Starting Early - Front-loading
3. User Test Policy: 3.1 Baseline for Testing; 3.2 Contract (3.2.1 Acceptance Criteria; 3.2.2 XXX and Supplier Responsibilities; 3.2.3 Testing and QA; 3.2.4 Quality Plan); 3.3 Testing Criteria; 3.4 Risk Criteria; 3.5 Starting Criteria; 3.6 Policy for Re-Testing; 3.7 Policy for Regression Testing; 3.8 Completion Criteria; 3.9 Handover to Production; 3.10 Documentation Plan (3.10.1 User Test Strategy; 3.10.2 Test Plan; 3.10.3 Test Log; 3.10.4 Incident Log; 3.10.5 Error Log; 3.10.6 Test Report)
4. Functional Requirements Testing: 4.1 Approach; 4.2 Process; 4.3 Special Application Needs
5. Technical Requirements Testing: 5.1 Usability Testing (5.1.1 Requirements; 5.1.2 Conducting Usability Tests); 5.2 Performance Testing (5.2.1 Requirements for Performance Testing; 5.2.2 Performance Test Cases); 5.3 Conversion Testing; 5.4 Security Testing (5.4.1 Security Tests; 5.4.2 Security Test Cases); 5.5 Documentation Testing; 5.6 Volume Testing; 5.7 Stress Testing; 5.8 Storage Testing; 5.9 Recovery Testing; 5.10 Installation Testing; 5.11 Reliability Testing; 5.12 Serviceability Testing; 5.13 Portability Testing; 5.14 Tests Not Required by the Users
6. User Test Infrastructure: 6.1 Test Environment (6.1.1 Support; 6.1.2 Roles and Responsibilities; 6.1.3 Test Environment); 6.2 Tools and Automation (6.2.1 Comparators; 6.2.2 Test Data Generators; 6.2.3 Capture/Replay Tools; 6.2.4 Testing Information Systems; 6.2.5 Database Query and Maintenance Facilities; 6.2.6 Transaction Simulators)
7. Schedule: 7.1 Milestone Plan; 7.2 Activities to be Resourced; 7.3 Skills Required
8. User Test Execution: 8.1 Acceptance Test Procedure (8.1.1 Pre-Test Meeting; 8.1.2 During the Test; 8.1.3 Post-Test Meeting); 8.2 Software Delivery; 8.3 Testing to Plan; 8.4 Handling Failures; 8.5 Logging Tests; 8.6 Error Classification; 8.7 Controlling Releases of New Versions; 8.8 Regression Testing; 8.9 Documentation
This is the table of contents of a 51-page document for an acceptance test of an outsourced development (written by me). A large safety-related system might have 150 pages of test strategy supported by 10-20 other risk-related documents. "Does size matter?"

Contexts of Test Strategy
[Diagram: Test Strategy at the centre, surrounded by its contexts: Axioms, Communication, Early Testing, Risks, De-Duplication, Opportunities, Goals, Automation, Culture, Contract, User involvement, Constraints, Human resource, Artefacts, Skills, Environment, Process (lack of?), Timescales]

Introducing the Test Axioms

Test Axioms
- Formulated as a context-neutral set of rules for testing systems
- They represent the critical thinking processes required to test any system
- There are clear opportunities to advance the practice of testing using them.
Testers Pocketbook: testers-pocketbook.com
Test Axioms website: test-axioms.com

How can we use Test Axioms?
- Test Axioms are not beginners' guides
- They can help you to think critically about testing
- They expose flaws in other people's thinking and their arguments about testing
- They generate some useful by-products
- They help you to separate context from values
- Interesting research areas: the First Equation of Testing, Testing Uncertainty Principle, Quantum Theory, Relativity, Exclusion Principle... you can tell I like physics.

The Axioms are thinking tools

The Axioms: Stakeholder, Basis, Oracle, Fallibility, Scope, Value, Coverage, Never-Finished, Delivery, Good-Enough, Environment, Repeat-Test, Event, Design, Prioritisation, Sequencing

Grouping the Axioms: Stakeholder, Basis, Oracle, Fallibility, Scope, Value, Coverage, Never-Finished, Delivery, Good-Enough, Environment, Repeat-Test, Event, Design, Prioritisation, Sequencing

The three axiom groups:
- Stakeholder, Value, Scope, Fallibility, Good-Enough
- Delivery, Repeat-Test, Sequence, Environment, Event, Never-Finished
- Design, Basis, Coverage, Prioritisation, Oracle

Testing needs stakeholders (p64)
Summary: Identify and engage the people or organisations that will use and benefit from the test evidence we are to provide.
Consequence if ignored or violated: There will be no mandate or any authority for testing. Reports of passes, fails or enquiries have no audience.
Questions: Who are they? Whose interests do they represent? What evidence do they want? What do they need it for? When do they want it? In what format? How often?

Test design is based on models (p68)
Summary: Choose test models to derive tests that are meaningful to stakeholders. Recognise the models' limitations and the assumptions that the models make.
Consequence if ignored or violated: Test design will be meaningless and not credible to stakeholders.
Questions:
- Are design models available to use as test models? Are they mandatory?
- What test models could be used to derive tests from the Test Basis? Which test models will be used?
- Are test models to be documented or are they purely mental models?
- What are the benefits of using these models? What simplifying assumptions do these models make?
- How will these models contribute to the delivery of evidence useful to the acceptance decision-makers?
- How will these models combine to provide sufficient evidence without excessive duplication?
- How will the number of tests derived from models be bounded?

Test execution requires a known, controlled environment (p77)
Summary: Establish the need and requirements for an environment and test data to be used for testing, including a mechanism for managing changes to that environment, in good time.
Consequence if ignored or violated: Environments are not available in time or are unsuitable for testing. This will delay testing or cause tests to be run in the wrong environment and undermine the credibility of evidence produced.
Questions:
- Who is responsible for the acquisition, configuration and support of test environments?
- What assumptions regarding test environments do our test models make?
- How will requirements for test environments be articulated and negotiated?
- How will the validity and usability of test environments be assured?
- How will changes to environments be managed, consistent with changes in requirements and other deliverables under test?
- How will the state of environments, including backed-up and restored versions, be managed?

Creating a Test Strategy

First Equation of Testing
Axioms + Context + Values + Thinking = Approach
Not an equation in the mathematical sense: you need to consider three key aspects (Axioms, Context, Values) and do a lot of thinking.

Why is the equation useful?
- Separation of Axioms, context, values and thinking
- Tools, methodologies, certification and maturity models promote approaches without reference to your context or values (no thinking is required!)
- Without a unifying test theory you have no objective way of assessing these products.

One context, multiple approaches
Given a context, practitioners can promote different approaches based on their values. Values are preferences or beliefs:
- Pre-planned v exploratory
- Predefined v custom process
- Requirements-driven v goal-based
- Standard documentation v face-to-face comms.
Some contexts preclude certain practices: "no best practices".

Axioms allow (ensure) different approaches and expose positions
Separating axioms, context and values clarifies positions. For example:
- 'Structured' test advocates have little (useful) to say about Agile contexts
- Exploratory test advocates have little (useful) to say about contract/requirements-based acceptance.
The disputes between these positions are more about values than practices in context.

Testing in Staged Projects
The V-Model, W-Model and Goal-Based Approaches

V-Model
Requirements -> User Acceptance Test; Functional Specification -> System Test; Physical Design -> Integration Test; Program Specification -> Unit Test
- Is there ever a one-to-one relationship between baseline documents and testing?
- Where is the static testing (reviews, inspections, static analysis etc.)?
Now, you might be thinking: isn't the V-model old-fashioned, and doesn't this risk-based testing replace it? Our answers are yes and no. Yes, the V-model is old-fashioned, but it need not be as rigid as many people think; and no, risk-based testing doesn't replace it as such, but we base our approach on an enhancement of the V-model. So what's wrong with the basic V-model? Many criticisms have been made of it, but we pick on two here: the one-to-one arrows, and no visibility of static testing.

Work products
- Project documents: schedule, quality plan, test strategy, standards
- Deliverables: requirements, designs, specifications, user documentation, procedures
- Software: custom-built or COTS components, sub-systems, systems, interfaces
- Infrastructure: hardware, O/S, network, DBMS
- Transition plans, conversion software, training...
In describing the W-model we emphasised how all work products should get tested somewhere, but not necessarily in a rigid, one-to-one correspondence with test stages. In the widest sense, everything on this slide is a work product. But in the same way as we distinguished (on an earlier slide) product risks from project and process risks, it's worth clarifying our scope as testers by distinguishing those work products (we call them deliverables for convenience) which are the basis for risk-based testing. The risks associated with project documents are project and process risks, and are the concern of project management and quality management. The risks of faults occurring in deliverables, whether they be documents or executable software, are what we base testing on.

What do we mean by testing?
Testing is the process of evaluating the deliverables of a software project to:
- detect faults so they can be removed
- demonstrate products meet their requirements
- gain confidence that products are ready for use
- measure and reduce risk.
Testing includes:
- static tests: reviews, inspections etc.
- dynamic tests: unit, system, acceptance tests etc.

W-Model
Write Requirements -> Test the Requirements; Specify System -> Test the Specification; Design System -> Test the Design; Write Code -> Unit Test; Build Software -> Integration Test; Build System -> System Test; Install System -> Acceptance Test
The W-model retains the correspondence of test activities to specification and software development activities, which is the basis of many project and quality management methods, but: it adds static testing to the overall framework for verification and validation (this includes the practice of early test preparation as part of "test the requirements, test the specification" etc.); and it subtracts the rigid arrows between test activities and baseline documents. We might not have a full set of those documents, and some of those we do have may be out of date or faulty, so we may need, for example, to base system testing on elements of the original requirements and the design in addition to the system spec. In extreme cases some of those levels might exist only in people's heads, yet we are still asked to test. This is one aspect of exploratory testing, which is receiving a lot of attention currently. To sum up: it's a major point of the W-model that all work products get tested somewhere, whether those products are actual software or "merely" documents.

W-Model and static testing
Examples of static testing, in the broad sense, mapped onto the W-model: Requirements Animation, Early Test Case Preparation, Scenario Walkthroughs, Reviews, Inspections, Static Analysis, Inspection.

W-Model and dynamic testing
Examples of dynamic test types, approaches and techniques mapped onto the W-model: Business Integration Testing, System Integration Testing, Performance Testing, Usability Testing, Security Testing, Boundary Value Testing, Equivalence Partitioning, Exploratory Testing, Path Testing.
Path testing, equivalence partitioning and boundary value testing are techniques; usability, security, performance, SIT and BIT are techniques which are mapped (for e-business testing in particular) to test types in RBEBT; exploratory testing is a wider approach.

It's usually more complicated: a real high-level test plan
There were three streams of testing:
- Core systems: the SAP modules and new applications and interfaces
- Non-core systems: enhancements and changes to existing legacy systems
- All systems: the integrated whole, tested as a system of systems.
FDR is Full Dress Rehearsal (for the cutover). The whole was programme-managed.

Goals
- The fundamental business objectives of the system(s) to be built, implemented and used
- The benefits of undertaking the project
- The payoff(s) that underpin and justify the project.
Risks are what threaten the goals of a project.

Goal-Based Test Strategy
The test strategy must set out how:
- Achievement of a project's goals is evidenced or demonstrated
- The risks that threaten goals will be explored, re-assessed and deemed acceptable (or not).
We need to understand the goals and how achievement will be measured. We need to understand risks (product risks in particular) and how they will be explored and exposed.

A goal network (aka results chain or logic model)
Every project has a network of dependent interim and ultimate goals, threatened by risks, leading to the ultimate business goal. Your strategy will identify the test activities that will measure goal achievement and provide evidence about these risks.

Introduction to Risk

The definition of risk
Italian dictionary: risicare, "to dare". A simple generic definition: "the probability that undesirable events will occur". In this tutorial, we will use this definition: "A risk threatens one or more of a project's goals and has an uncertain probability."

Some general statements about risk
- Risks only exist where there is uncertainty: if the probability of a risk is zero or 100%, it is not a risk
- Unless there is the potential for loss, there is no risk ("nothing ventured, nothing gained")
- There are risks associated with every project
- Software development is inherently risky.

Three types of software risk
- Project risk: resource constraints, external interfaces, supplier relationships, contract restrictions
- Process risk: variances in planning and estimation, shortfalls in staffing, failure to track progress, lack of quality assurance and configuration management. Primarily a management responsibility; planning and the development process are the main issues here.
- Product risk: lack of requirements stability, complexity, design quality, coding quality, non-functional issues, test specifications.
Testers are mainly concerned with product risk. Requirements risks are the most significant risks reported in risk assessments.

Risk Management Process

Process
- Risk identification: what are the risks to be addressed?
- Risk analysis: nature, probability, consequences, exposure
- Risk response planning: pre-emptive or reactive risk reduction measures
- Risk resolution and monitoring.
Stakeholders should be involved at all stages.
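The analysis step above is often supported by a simple risk register. As a minimal sketch (the risk names and numbers are invented for illustration), exposure can be computed as probability x impact and used to rank product risks for test attention:

```python
# Hypothetical risk-register sketch: exposure = probability x impact,
# used during risk analysis to rank product risks.
risks = [
    {"id": "PR1", "desc": "calculation errors",   "probability": 0.3, "impact": 5},
    {"id": "PR2", "desc": "poor performance",     "probability": 0.6, "impact": 3},
    {"id": "PR3", "desc": "insecure admin pages", "probability": 0.1, "impact": 5},
]

for r in risks:
    r["exposure"] = r["probability"] * r["impact"]

# Highest exposure first: these risks get test attention earliest
for r in sorted(risks, key=lambda r: r["exposure"], reverse=True):
    print(f'{r["id"]}: exposure {r["exposure"]:.1f} - {r["desc"]}')
```

The ranking, not the raw numbers, is what matters: it gives stakeholders a visible basis for agreeing where testing effort goes first.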

The danger slope
[Diagram: the danger slope, with an arrow showing where we want to move all risks]

Risk response planning
- Do nothing!
- Pre-emptive risk reduction measures: information buying, process model, risk influencing, contractual transfer
- Reactive risk reduction measures: contingency plans, insurance.
"But this all sounds highly theoretical; we could never get this to work in my company!" This is where testing fits in.

Even penguins know how to manage risk!

Role of Testing in Product Risk Management

Faults, failure and risk
System failures are what we fear; the faults that cause failures are our prey. Uncertainty is what makes us concerned:
- what type of faults are present in the system?
- how many faults are in the system?
- did testing remove all the serious faults?
Testing helps us to address these uncertainties.

Testing helps to reduce risk
If risk assessment steers test activity, we design tests to detect faults and so reduce the risks caused by faulty products. Faults found early reduce rework, cost and time lost in later stages. Faults found are corrected and re-tested, and so the quality of all products is improved.

Testing can measure risk
Testing is a measurement activity. Tests that aim to find faults provide information on the quality of the product: which parts of the software are faulty, and which parts are not. Tests help us understand the risk of release, and understanding the risks helps us to make a risk-based decision on release. After testing, our risk assessment can be refined.
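One way to picture "refining the risk assessment after testing" is to tie each product risk to the results of the tests that target it. A hypothetical sketch (the risk names and results are invented):

```python
# Hypothetical sketch of refining a risk assessment from test results.
# A product risk stays open until every test that targets it has passed.
risk_tests = {
    "calculations wrong": ["pass", "pass", "pass"],
    "performance poor":   ["pass", "fail"],
    "security weak":      [],  # no tests run yet
}

statuses = {}
for risk, results in risk_tests.items():
    if results and all(r == "pass" for r in results):
        statuses[risk] = "evidence the risk is addressed"
    elif "fail" in results:
        statuses[risk] = "risk confirmed: faults found"
    else:
        statuses[risk] = "unknown: no evidence yet"

for risk, status in statuses.items():
    print(f"{risk}: {status}")
```

The third status is the important one: a risk with no test evidence is still an open risk, whatever the pass counts elsewhere look like.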

The test passes…
The risk could be unchanged because:
Risk probability higher because:
Risk probability lower because:
Risk consequence higher because:
Risk consequence lower because:
(Exercise: think of reasons why each of the five statements could be true for a test that passes.)

The test fails…
The risk could be unchanged because:
Risk probability higher because:
Risk probability lower because:
Risk consequence higher because:
Risk consequence lower because:
(Exercise: think of reasons why each of the five statements could be true for a test that fails.)

Why use risks to define test objectives?
- If we focus on risks, we know that bugs relating to the selected mode of failure are bound to be important
- If we focus on particular bug types, we will probably be more effective at finding those bugs
- If testers provide evidence that certain failure modes do not occur in a range of test scenarios, we will become more confident that the system will work in production.

Risks as failure modes or bug types
Risks describe 'what we don't want to happen'. Typical modes of failure:
- calculations don't work
- pages don't integrate
- performance is poor
- user experience is uncomfortable.
Think of them as 'generic bug types'.

Defining a test objective from risk
We 'turn around' the failure mode or risk:
- Risk: a BAD thing happens, and that's a problem for us
- Test objective: demonstrate using a test that the system works without the BAD thing happening
- The test: execute important user tasks and verify the BAD things don't happen in a range of scenarios.
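As an illustration of 'turning around' a risk, here is a hypothetical Python sketch. Assume the risk is "order totals are calculated wrongly"; the test objective becomes: demonstrate, across a range of scenarios, that the BAD thing (a wrong total) does not happen. The `order_total` function is an invented toy stand-in for the system under test:

```python
# Hypothetical 'turned around' risk: demonstrate across scenarios that
# the BAD thing (a wrong order total) does not happen.
def order_total(items, discount=0.0):
    """Toy system under test: sum of price*qty, minus a fractional discount."""
    subtotal = sum(price * qty for price, qty in items)
    return round(subtotal * (1 - discount), 2)

scenarios = [
    # (items, discount, expected total)
    ([(10.0, 1)], 0.0, 10.0),            # single line, no discount
    ([(10.0, 2), (2.5, 4)], 0.0, 30.0),  # multiple lines
    ([(100.0, 1)], 0.25, 75.0),          # discounted order
    ([], 0.0, 0.0),                      # empty order: boundary case
]

for items, discount, expected in scenarios:
    actual = order_total(items, discount)
    assert actual == expected, f"BAD thing happened: {actual} != {expected}"
print("All scenarios passed: no wrong totals observed")
```

Each scenario that runs clean is a piece of evidence that the failure mode does not occur; each assertion that fires is a fault found, which is exactly what the test was designed to expose.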

Risk-based test objectives are usually not enough
- Other test objectives relate to broader issues:
  - contractual obligations
  - acceptability of a system to its users
  - demonstrating that all or specified functional or non-functional requirements are met
  - non-negotiable test objectives, such as mandatory rules imposed by an industry regulatory authority
  - and so on
- Generic test objectives complete the definition of your test stages.

Generic test objectives
(NEIL, to PAUL's slide)

Tests as demonstrations
- "Demonstrate" is the word most often used in test objectives
- Better than "Prove", which implies mathematical certainty (which is impossible)
- But is the word "demonstrate" too weak?
  - it represents exactly what we will do
  - we provide evidence for others to make a decision
  - we can only run a tiny fraction of the possible tests, so we really are only giving a demonstration with a small sample of tests.
(NEIL, to PAUL's slide)

But tests should aim to locate faults, shouldn't they?
- The tester's goal: to locate faults
- We use boundary tests, extreme values, invalid data, exceptional conditions etc. to expose faults:
  - faults we find are fixed and re-tested
  - we are left with tests that were designed to detect faults; some did detect faults, but no longer do
- We are left with evidence that the feature works correctly, and our test objective is met
- There is no conflict between strategic risk-based test objectives and the tactical goal of locating faults.
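The boundary-value idea above can be made concrete. The following sketch is entirely hypothetical: `accept_stake` and its 1..100 rule are invented here to show fault-finding test design; once any defects are fixed and re-tested, the same cases remain as evidence the feature works.

```python
# Hypothetical feature: accept a lottery stake between 1 and 100 inclusive
# (an assumed business rule, not from the source material).
def accept_stake(amount: int) -> bool:
    """Return True if the stake amount is valid."""
    return 1 <= amount <= 100

# Boundary, extreme and invalid inputs chosen to expose off-by-one faults,
# paired with the outcomes the tests must demonstrate
cases = {-1: False, 0: False, 1: True, 100: True, 101: False}
results = {value: accept_stake(value) for value in cases}
```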

Testing and meeting requirements
- Risk-based test objectives do not change the methods of test design much
- Functional requirements: we use formal or informal test design techniques as normal
- Non-functional requirements: test objectives are often detailed enough to derive specific tests.
(NEIL, to PAUL's slide)

Goals, Risks and Designing the Test Process
(PAUL)

Test activities overlay the goal network (not all goals in scope)
[diagram: test phases/activities mapped onto the network of goals and risks]

Risks, deliverables and test types
[diagram: deliverables (Requirements, HL Design, Tech Design, Prog. Spec., Code; Sub-System, System) mapped to test activities (Walkthrough, Review, Inspection, Prototype, Early test preparation, Unit Test, Static analysis, Integration Test, System Test, Acceptance Test) and to non-functional goals/risks (Security, Performance, Usability, Backup/recovery, Failover/restart, Volume, Stress, etc.)]

Risks, objectives and test stages
[table: Goal/Risk | Test Objective | Sub-System Testing Technique | System Testing Technique]

From goals/risks to test process
- Identify/analyse the goals and risks: what evidence must testing provide?
- Define, for each goal/risk:
  - objective(s)
  - object under test
  - test and coverage model(s), coverage target(s)
  - entry, exit and acceptance criteria
  - responsibility, deliverables
  - environment, tools, techniques, methods
- Select a test activity to achieve the test objective
- Collect test activities into stages and align them with the goal network or project plan. (PAUL)
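The per-goal/risk record listed above has a natural data shape. This is a sketch only; the class name, field names and every example value are assumptions, not the author's notation.

```python
# Hypothetical record for one goal/risk, capturing the fields a test
# strategy would define for it (all names and values are invented).
from dataclasses import dataclass

@dataclass
class RiskTestDefinition:
    risk: str                 # the goal or risk driving the testing
    objectives: list          # what testing must demonstrate
    object_under_test: str
    coverage_model: str
    coverage_target: str
    exit_criteria: str
    responsibility: str
    environment: str

defn = RiskTestDefinition(
    risk="performance is poor",
    objectives=["demonstrate peak-hour response stays within agreed limits"],
    object_under_test="ticket purchase service",
    coverage_model="peak load profiles",
    coverage_target="all profiles executed",
    exit_criteria="95th percentile response within threshold",
    responsibility="system test team",
    environment="performance rig",
)
```

A collection of such records, grouped by test stage, is essentially the risk-based portion of the strategy document.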

Testing Uncertainty
- Planning relies on predictions of the future, but how can you predict test status at a future date?
- The answer is … you can't
- The Testing Uncertainty Principle:
  - one can predict test status, but not when it will be achieved
  - one can predict when a test will end, but not its status.

Testing in the real world
- Time and cost limit what can be done
- Some risks may be deemed acceptable with little or no testing
- Some goals will be deemed 'achieved' without testing at all
- Items on the plan will be de-scoped to squeeze the plan into the available timescales or budget:
  - mark de-scoped line items 'out of scope'
  - if someone asks later what happened, you have evidence that the goal/risk was considered, but waived. (PAUL)

Acceptance Criteria

Acceptance Criteria
- Represent the overall readiness to commit to going live, considering:
  - the readiness of the solution
  - the readiness of the business
  - the ability to implement (and roll back, if necessary)
  - the ability to live with the difficulties of early days
  - the ability to support the system in its early days
  - the need to be compliant
- Here's a generic, but comprehensive, set of Acceptance Criteria for a large programme.

Level 1 Criteria (example)
[matrix columns: Steady State Operation | Implementation | Early Life]
- The Solution (system, process and data) is ready
- The Organisation (business and IT) is ready
- We are ready to Implement the solution (and roll back if necessary)
- We are ready to Support the solution
- Operational Risks are understood and mitigating actions taken
- We meet the necessary Regulatory and Compliance requirements

Level 2 Criteria (example)
- The Solution (system, process and data) is ready
  - The users have proven that the solution (system and data) supports the business processes.
  - The quality level is high (demonstrated by a low quantity and severity of defects), with agreed workarounds for significant defects.
  - The system performance is sufficient and it is reliable, stable and robust to failure.
- The Organisation (business and IT) is ready
  - New organisational structures are in place and necessary positions are filled.
  - Sufficient staff have been trained in the new solution and competency has been assessed to be acceptable.
  - Third parties understand the changes and their readiness has been confirmed.
  - Benefit realisation plans are in place.
- We are ready to Implement the solution (and roll back if necessary)
  - Implementation and roll-back plans have been adequately tested, rehearsed and communicated.
  - Roles, responsibilities and the escalation path over the cutover period have been agreed.
  - Temporary business workarounds during the cutover period have been agreed.
- We are ready to Support the solution
  - Early Life support processes and people are in place.
  - Sufficient transition and handover has been completed with the support and operations groups to enable the solution to be supported in Early Life.
  - Processes and metrics are in place to provide early warning of operational problems during Early Life.
- Operational Risks are understood and mitigating actions agreed
  - For the significant operational risks, mitigating actions have been agreed including, where possible, tested contingency plans.
  - Any residual risk (i.e. not fully mitigated) has been understood and accepted by senior management.
- We meet the necessary Regulatory and Compliance requirements
  - The necessary regulatory and compliance approvals have been received (Audit, SOX, System Security).
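The two levels roll up mechanically: a level-1 criterion is met only when all of its level-2 sub-criteria are met. The sketch below illustrates that rollup; the abbreviated names and the True/False states are invented for illustration.

```python
# Hypothetical rollup of level-2 sub-criteria into level-1 criteria.
# Names are abbreviated from the example above; states are invented.
criteria = {
    "The Solution is ready": {
        "users have proven the solution": True,
        "quality level is high": True,
    },
    "The Organisation is ready": {
        "structures in place and positions filled": True,
        "staff trained and competent": False,
    },
}

# A level-1 criterion passes only if every sub-criterion passes
level1 = {name: all(subs.values()) for name, subs in criteria.items()}
```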

Test exit criteria = Assumptions
- A test exit criterion is, in effect, a planning assumption
- If exit criteria are met on time or earlier, our planning assumptions are sound: we are where we want to be
- If exit criteria are not met, or not met on time, our plan was optimistic: the plan needs adjustment, or we must relax the criteria
- What do test exit criteria actually mean?
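The three outcomes above reduce to a small decision rule. A minimal sketch, assuming only a met/not-met flag and planned vs actual dates (all names and dates invented):

```python
# Sketch: classifying an exit-criteria outcome against the plan.
from datetime import date

def assess_exit(criteria_met: bool, planned: date, actual: date) -> str:
    """Treat the exit criterion as a planning assumption and test it."""
    if criteria_met and actual <= planned:
        return "plan sound: we are where we want to be"
    if criteria_met:
        return "plan optimistic: criteria met late, adjust the plan"
    return "criteria not met: adjust the plan or relax the criteria"

# Example: criteria met three days after the planned date
status = assess_exit(True, date(2017, 4, 10), date(2017, 4, 13))
```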

Goal, Risk and Coverage-Based Test Reporting

Test Strategy overview
[diagram: project/test phases (Reqs, Design, Build, Integ, Systest, UAT, Trial, Prod.) mapped against test drivers (business goals, risks), test objectives and coverage targets; objectives for each test phase are easily identified]

Risk-based reporting
[chart: residual risks plotted against progress through the test plan, from start to planned end; all risks are 'open' at the start, and the curve at any point shows the residual risks of releasing TODAY]

Benefits of risk-based test reporting
- The risk of release is known:
  - on the day you start, and throughout the test phase
  - on the day before testing is squeezed
- Progress through the test plan brings positive results: risks are checked off, benefits become available
- Pressure: to eliminate risks, and for testers to provide evidence that risks are gone
- We assume the system does not work until we have evidence: "guilty until proven innocent"
- Reporting is in the language that management and stakeholders understand.
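The reporting model above is just a risk register that testing drains. A minimal sketch (risk IDs and states are invented): every risk starts open, test evidence closes risks, and the open set at any moment is the residual risk of releasing today.

```python
# Sketch of risk-based reporting: all risks 'open' at the start.
risks = {"R1": "open", "R2": "open", "R3": "open"}

def close_risk(register: dict, risk_id: str) -> None:
    """Record that testing has provided evidence addressing this risk."""
    register[risk_id] = "closed"

# Testing provides evidence for R1
close_risk(risks, "R1")

# The residual risk of releasing TODAY
residual = sorted(r for r, state in risks.items() if state == "open")
```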

Goal/Risk based reporting
[chart: risks move from Open to Closed over time; as risks close, goals/benefits become available for release]

Benefits of benefit-based test reporting
- The risk(s) that block every benefit are known:
  - on the day you start, and throughout the test phase
  - before testing is squeezed
- Progress through the test plan brings positive results: benefits are delivered
- Pressure: to eliminate risks, and for testers to provide evidence that benefits are delivered
- We assume that the system has no benefits to deliver until we have evidence
- Reporting is in the language that management and stakeholders understand.
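Benefit-based reporting inverts the risk view: a benefit is available for release only when every risk that blocks it has been closed by test evidence. A sketch with invented benefit and risk IDs:

```python
# Sketch: which benefits does each open risk block, and which benefits
# are deliverable given the risks closed so far? (IDs are invented.)
blocking = {"B1": {"R1", "R2"}, "B2": {"R3"}}
closed_risks = {"R1", "R2"}

# A benefit is available when its blocking risks are a subset of the closed set
available = sorted(b for b, blockers in blocking.items()
                   if blockers <= closed_risks)
```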

Communicating Test Strategies

IEEE 829 Test Plan Outline (based on IEEE Standard 829-1998)
1. Test Plan Identifier
2. Introduction
3. Test Items
4. Features to be Tested
5. Features not to be Tested
6. Approach
7. Item Pass/Fail Criteria
8. Suspension Criteria and Resumption Requirements
9. Test Deliverables
10. Testing Tasks
11. Environmental Needs
12. Responsibilities
13. Staffing and Training Needs
14. Schedule
15. Risks and Contingencies
16. Approvals
There is an IEEE standard for a Test Plan. This standard can be applied to the Test Strategy, the Master Test Plan or any of the Phase Test Plans.

I'm no fan of IEEE 829
- Used as a strategy checklist: scarily vague (don't go there)
- Used as a documentation template/standard: flexible, not prescriptive, but encourages a copy-and-edit mentality (documents that no one reads)
- But many testers seek guidance on:
  - what to consider in a test strategy
  - communicating their strategy to stakeholders and project participants

IEEE 829 Plan and Axioms
- Items 1, 2: Administration
- Items 3, 4, 5: Scope Management, Prioritisation
- Item 6: all the Axioms are relevant
- Items 7, 8: Good-Enough, Value
- Item 9: Stakeholder, Value, Confidence
- Item 10: all the Axioms are relevant
- Item 11: Environment
- Item 12: Stakeholder
- Item 13: all the Axioms are relevant
- Item 14: all the Axioms are relevant
- Item 15: Fallibility, Event
- Item 16: Stakeholder Axioms

A Better Test Strategy and Plan
- Stakeholder Objectives
  - Stakeholder management
  - Goal and risk management
  - Decisions to be made and how (acceptance)
  - How testing will provide confidence and be assessed
  - How scope will be determined
- Design approach
  - Test phases and sequence
  - Sources of knowledge (bases and oracles)
  - Sources of uncertainty
  - Models to be used for design and coverage
  - Prioritisation approach
- Delivery approach
  - Test sequencing policy
  - Repeat test policies
  - Environment requirements
  - Information delivery approach
  - Incident management approach
  - Execution and end-game approach
- Plan (high or low-level)
  - Scope
  - Tasks
  - Responsibilities
  - Schedule
  - Approvals
  - Risks and contingencies

Reaching your audience
- It's all about consensus
- The strategy must address stakeholders' concerns
- Present the strategy aligned with those concerns
- "Their part" of the strategy sets out HOW it addresses those concerns:
  - will it evidence goal achievement?
  - does it address the risks?
  - does it set out MY and OTHERS' responsibilities?
  - how do MY activities fit into the context of the larger plan?
- Involve stakeholders in risk workshops and reviews, and consult them before presenting your proposal.

Audience messages
"Business goals and risk" focus: project stakeholders, management, users
- Are my goals being evidenced?
- Has every risk been identified?
- Has every risk been addressed?
- Is the right group addressing the risk?
- Will tests address MY concerns?

Audience messages 2
"Contractual aspects" focus: software suppliers and contract people
- Does the strategy match the contract?
- Are test activities aligned with stage payments?
- Is the strategy fair, and does it allow me to be paid?
- Does the strategy impose the right level of test activity on the supplier?
- How will testing demonstrate that we get what we want?

Audience messages 3
"Risks and responsibilities" focus: suppliers, developers, System Test Team, UAT Team
- Do I know what I have to do in my testing?
- Who covers the things I don't cover?
- When do I start testing? When do I stop?
- What evidence do I need to provide to address the stakeholder risks?
- How do I report progress?

Audience messages 4
"Meeting business and technical requirements" focus: business analysts, technical architects, operations, technical support
- How will testing show the architecture "works"?
- How will testing give me confidence the system "works"?
- How will testing demonstrate the system can be operated and supported?
- Is the system ready to be deployed?

Communicating with the audience
- Presentations: appropriate for management teams or larger groups; Q&A can stay at a high level
- Walkthroughs: smaller groups get more involved in the detail
- In person: involve individuals in workshops, reviews and handovers
- Focus on the audience messages regardless of the medium.

Closing Comments

Intelligent test strategy
- Helps stakeholders:
  - they get more involved and buy in
  - they have better visibility of the test process
- Helps testers:
  - clear guidance on the focus of testing
  - approval to test against risks in scope
  - approval to not test against risks out of scope
  - clearer test objectives upon which to design tests.

Intelligent test execution and reporting
- Helps stakeholders: they have better visibility of the results available and the risks that block results
- Helps management:
  - to see progress in terms of risks addressed and results available for delivery
  - to manage the risks that block acceptance
  - to make a better-informed release decision
- Helps project managers, helps testers.

The Case Study
We work for the operator of a national lottery

We'll work through several exercises
- Introduction to the project (30-40m)
  - regulatory framework and stakeholders
  - stakeholder objectives/benefits and concerns
- The lottery process (more risks) (20m)
- Project Goal network and Test Process (20m)
- Test Phase Responsibilities (20m)
- Test Phase Definition (20m)
- What's Left?

- If the information you need is not provided, make it up
- If there is an unknown, don't get stuck: as a team, think of a possible scenario and assume it holds
- There won't be time for me to invent answers and explain them
- … and your ideas are probably more interesting :-)

Introduction to the project (30-40 minutes)
- On your own:
  - read the Case Study overview (pages 10-11 ONLY)
  - read the regulatory framework
  - learn about the stakeholders
  - identify 1-2 goals and 3-4 risks
- As a TEAM, discuss your goals and risks
  - list them in your workbook on page 12
  - if you need more space, use page 25 onwards…

Introduction retro
- Did everyone find the same goals/risks?
- Is it hard to find goals? Is it hard to find risks?
- Should these goals/risks be in scope for testing?
  - which ones aren't suitable for testing?
  - does it depend on how you define testing?
- Get your scope right/agreed!

The lottery process (20 mins)
- Read through the lottery process stage by stage (pages 13-15 ONLY)
- Each stage has a short narrative
- Can you think of any new risks?
- Add them to the table on page 12

Lottery process retro
- How many risks do you think there are to be documented?
- How many are in scope for testing?
- Who has the most risks of concern?
- For each risk, is there a single mitigation or more than one?

The Project Goal network and the Test Process (20 mins)
- The project goal network on page 16 is an outline of the goals of the project
- Read and discuss the diagram as a team
- Some test activities are shown, but are merged with development activities
- Mark up the Project Goal network with your test phases, or…
- Use page 19 to sketch out the network of goals and test phases that you think you need.

Project Goals retro
- Did having the system schematic help?
- Each test phase measures achievement
- Each test phase is associated with a project goal, activity or deliverable
- If a test phase 'passes', then the goal is achieved
- Does exiting a test phase confirm that a project goal has been achieved?

Test stage responsibilities (RACI/RASCI chart) (10 mins)
For each test phase, define some objectives (goals to demonstrate or risks to be mitigated) and assign responsibilities

Test Phase Definition (all day)
- You now have a much clearer understanding of the goals and risks
- You have identified a set of test stages and begun to assign goals/risks and responsibilities
- Read the page 20 guidelines for completing the full test stage definitions
- Create a full definition of one of your test stages (I suggest a system or integration test stage to start)

What’s Left/Missing? Lots!

What's missing?
- No requirements! Do you need them (yet)?
- No discussion of what the sources of knowledge (bases and oracles) are, or their quality/usefulness
- No discussion of test models:
  - how do we identify things to test?
  - what coverage targets could be used; can we define them; can we measure them?
- No discussion of our capabilities (the operator's) and the supplier's
- No mention of environments or technology

What's missing 2
- No discussion of acceptance criteria (consider slides 73-77 etc.)
- No discussion of what evidence stakeholders actually require, and in what format
- No mention of the incident process
- No discussion of test execution tools (developer or system level)
- No discussion of test design/record-keeping
- And lots more, I know…

Have we addressed all of your concerns?
Email me your test strategy questions; answers are often good ideas for blogs

Or… ask us to run a test strategy workshop at your site
Thank You!

How to Create a Test Strategy
Intelligent Assurance and Testing
13-Apr-17
Paul Gerrard
@paul_gerrard
paul@gerrardconsulting.com
Web: gerrardconsulting.com