Testing in Regulated Industry Environment
Iryna Kozachuk
Regulated Industry Environments
- Financial institutions (banks…)
- Marine shipping, ferry, and port services
- Air, railway, and road transportation (including airports, tunnels, bridges, etc.)
- Food and drugs
- Etc.
Non-Regulated Industry Environment
- Marketing
- Engineering
- Architecture
- Design
- Implementation
- Testing
- Review
- Test Case creation
Regulated Industry Environment
- Quality
- Regulatory
- Marketing
- Engineering
- Architecture
- Design
- Implementation
- Testing
- Review
- Test Case creation
Standards: General
- ISO 9000
- IEEE 829
- PCI compliance
- CFR Part 11

ISO 9000 is a family of standards related to quality management systems, designed to help organizations ensure that they meet the needs of customers and other stakeholders.

IEEE 829, the Standard for Software and System Test Documentation, specifies the form of a set of documents for use in eight defined stages of software testing, each stage potentially producing its own separate type of document.

Notes:
- Cyber security standards are covered by ISO 15408.
- PCI: the Payment Card Industry Data Security Standard (PCI DSS).
- CFR Part 11 (FDA): electronic records and signatures.
- IEEE: Institute of Electrical and Electronics Engineers.
Summary of ISO 9001:2008 in informal language
- The quality policy is understood and followed at all levels and by all employees.
- The business makes decisions about the quality system based on recorded data.
- The business determines customer requirements.
- When developing new products, the business plans the stages of development, with appropriate testing at each stage. It tests and documents whether the product meets design requirements, regulatory requirements, and user needs.
- The business regularly reviews performance through internal audits and meetings.
- Records show how and where raw materials and products were processed, to allow products and problems to be traced to the source.
IEEE 829 Test Plan (how; who; what; how long; what coverage)
- Test Design Specification: details test conditions, the expected results, and test pass criteria.
- Test Case Specification: specifies the test data for use in running the test conditions identified in the Test Design Specification.
- Test Procedure Specification: details how to run each test, including any set-up preconditions and the steps that need to be followed.
- Test Item Transmittal Report: reports on when tested software components have progressed from one stage of testing to the next.
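To make these document types concrete, here is a minimal sketch of an IEEE 829-style Test Case Specification captured as a Python dataclass. The field names follow the standard's intent, but the exact schema and the sample values are illustrative assumptions, not mandated by IEEE 829.

```python
from dataclasses import dataclass, field

@dataclass
class TestCaseSpecification:
    """One IEEE 829-style test case record (illustrative schema)."""
    case_id: str                     # unique, traceable identifier
    objective: str                   # what the case verifies
    requirement_ids: list = field(default_factory=list)  # traced requirements
    preconditions: str = ""          # required state before execution
    inputs: dict = field(default_factory=dict)           # test data
    expected_result: str = ""        # pass criterion

# Hypothetical example, traced to a requirement ID in the style used later in this deck.
case = TestCaseSpecification(
    case_id="AA-UM01",
    objective="Verify the Administrator password can be changed",
    requirement_ids=["AA-UM-78"],
    preconditions="Application installed; default accounts present",
    inputs={"user": "Administrator", "new_password": "N3w-Passw0rd"},
    expected_result="Password is updated and the old password is rejected",
)
```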
IEEE 829 (cont'd)
- Test Log: records which test cases were run, who ran them, in what order, and whether each test passed or failed.
- Test Incident Report: reports a defect.
- Test Summary Report: provides any important information uncovered by the tests, including assessments of the quality of the testing effort, the quality of the software system under test, and statistics derived from Incident Reports. The report also records what testing was done and how long it took, in order to improve any future test planning. This final document is used to indicate whether the software system under test is fit for purpose, according to whether or not it has met the acceptance criteria defined by project stakeholders.
FDA: CFR Part 11
Title 21 CFR Part 11 of the Code of Federal Regulations deals with the Food and Drug Administration (FDA) guidelines on electronic records and electronic signatures in the United States. Part 11, as it is commonly called, defines the criteria under which electronic records and electronic signatures are considered to be trustworthy, reliable, and equivalent to paper records.
Standards: SOP/DP (company-specific)
- Subsystem Requirements
- Subsystem Verification
- Design Trace Matrix
- Media Creation Process
- Defect Tracking
- Inspection Procedure
- Software Verification
- Version Numbering
- Software Configuration Management Process
- Unit and Integration Testing
- Software Design Reviews, etc.
Engineering Process
Concept:
- Draft of Software Architecture
- Preliminary schedule for SW and testing
Definition:
- SWE Development Plan
- Algorithm Plan
- Configuration Management Plan
- Requirements (SW and Algorithm)
- Build Requirements
- Verification Plan
- Trace Matrix (draft)
- Risk Analysis (draft)
Analysis and Design:
- SW Design
- Algorithm Design
- Verification Design
Implementation:
- Functionally complete SW for the iteration
- Unit tests
- Code inspections
- Verification Procedures
Integration testing:
- Selected Verification Procedures
- Dry run of procedures
- Exploratory testing
- SMART
- Unit tests
Integration testing:
- All Verification Procedures executed
- All defects verified
- Exploratory testing
- SMART
- Unit tests passed
- Verification Log updated
- Summary Report generated
Integration testing:
- ReadMe
- Release Notes
- System testing
- Validation testing
- Traceability Matrix
- Risk Analysis doc
- Golden Master
Software Quality Assurance Activities:
- Review: Software Requirements and/or Functional Specification; Technical Development Documents / SWE Designs
- Creation: Test Plan and test designs; requirements-based Test Cases; Test Suites/Procedures/Protocols; end-to-end procedures
- Peer review of the Test Cases/Procedures
- Functional, Exploratory, JIT, and Integration Testing
Software Quality Assurance Activities (cont'd)
- Build qualification
- Procedure execution
- Updating the Test Log
- Updating the Traceability Matrix
- Generating the Summary Report
- Defect submission/verification/closure
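Updating the Traceability Matrix lends itself to a mechanical check. Below is a minimal sketch that flags requirements with no covering procedure; the requirement and procedure IDs are hypothetical, written in the AA-XXYY style defined later in the Test Plan outline.

```python
# Minimal traceability check: every requirement should map to at least one
# verification procedure. All IDs here are hypothetical examples.
trace_matrix = {
    "AA-UM-77": ["AA-UM01"],
    "AA-UM-78": ["AA-UM01", "AA-UM02"],
    "AA-UM-472": [],  # not yet covered
}

uncovered = [req for req, procs in trace_matrix.items() if not procs]
if uncovered:
    print("Requirements without verification coverage:", ", ".join(uncovered))
```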
Requirement Review: Types of requirements
- WHY (business)
- WHAT (functional)
- HOW WELL (qualities, non-functional: robustness, compatibility, reliability, safety, etc.)
- HOW (design)
Requirements: IEEE Guide to the Software Engineering Body of Knowledge
"It is certainly a myth that requirements are ever perfectly understood or perfectly specified. Instead, requirements typically iterate toward a level of quality and detail that is sufficient to permit design and procurement decisions to be made."
Software Requirements Specification (SRS)
- A complete description of the system to be developed.
- The SRS documents functional requirements and non-functional requirements.
- Requirements are the foundation of any development project.
The Reality
- Writing down requirements takes a long time.
- Developers don't read the requirements.
- Requirements change throughout the project.
Cost of Requirement defects
Requirements defects cost a lot to fix downstream.
The "Good Enough" Criteria:
- Sufficiently complete
- Feedback from stakeholders
- Meets goodness checks
- Setting scale for non-functional requirements
- Progressive elaboration: develop thoroughly in steps and continuously by increments; enough information to answer the question at hand; do requirement work TO THAT LEVEL
Defining a Good Requirement
- Correct (technically)
- Complete (expresses a whole idea or statement)
- Clear (unambiguous and not confusing)
- Consistent (not in conflict with other requirements)
- Traceable (uniquely identified)
Goodness Checks:
Examples: The "Anny had a little lamb" Statement
The same sentence, stressed on different words, carries different meanings. In contrast to:
- Anny had a little lamb … it was hers, not someone else's
- Anny had a little lamb … but she doesn't have it anymore
- Anny had a little lamb … just one, not several
- Anny had a little lamb … it was very, very small
- Anny had a little lamb … not a goat, a chicken, etc.
- Anny had a little lamb … but John still has his
Setting scale: Progressive elaboration
- Develop thoroughly in steps and continuously by increments.
- Gather enough information to answer the question at hand.
- Do requirement work TO THAT LEVEL.
Writing Test Cases Based on the Requirements, and Walking on Water…
Both are easy when frozen…
Regulated: Required Feedback
- Regulatory
- Legal
- Quality
- Tech Publication
- System Verification
- Validation
- Medical
- SQE, etc.
Example: Requirements
- AA-UM-77: Application shall provide predefined Service and Administrator user accounts.
- AA-UM-78: Administrator password can be modified by Service.
- AA-UM-472: A user with administrative privileges has the ability to generate a one-time-use password that the user needs to change upon first login.
Example: Functional Specification
- The default Administrator account will have User ID 'Administrator' and password 'administrator'. This password can be changed through the Change Password screen.
- The 'Service' account will have User ID 'Service' and password 'MyService'. The password for the Service account cannot be modified.
- Password expiration for these accounts should be set to NEVER.
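A requirements-based test for AA-UM-78 might look like the following sketch. The `Application` class is a hypothetical stand-in for the real system under test; only the traced requirement and the default account values come from the slides above.

```python
import pytest

# Hedged sketch of requirements-based tests for AA-UM-78. The Application
# class is a hypothetical stand-in for the system under test; the default
# account values come from the Functional Specification example above.
class Application:
    def __init__(self):
        self._passwords = {"Administrator": "administrator", "Service": "MyService"}

    def login(self, user, password):
        assert self._passwords[user] == password

    def change_password(self, account, new_password):
        if account == "Service":
            raise PermissionError("Service password cannot be modified")
        self._passwords[account] = new_password

    def can_login(self, user, password):
        return self._passwords.get(user) == password

def test_aa_um_78_service_can_modify_administrator_password():
    app = Application()
    app.login(user="Service", password="MyService")
    app.change_password(account="Administrator", new_password="S3cure-Pass!")
    assert app.can_login(user="Administrator", password="S3cure-Pass!")

def test_service_password_cannot_be_modified():
    app = Application()
    app.login(user="Service", password="MyService")
    with pytest.raises(PermissionError):
        app.change_password(account="Service", new_password="anything")
```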
Example: Software Design Specification
Test Plan Outline (?)
Test Plan Outline
- Purpose
- Scope
- References
- Definitions, Abbreviations
- Features To Be Tested
- Features Not To Be Tested
- Testing Strategy (Functional and Non-Functional); Build Evaluation Strategy
- Naming Convention
- Acceptance and Suspension (Mitigation) Criteria
- Equipment/Environmental Needs
- Roles and Responsibilities; Staffing and Training
- Schedule
- Change History
Test Plan Outline (cont'd)
Definitions, Abbreviations – examples:
- DHF: Design History File
- DOORS: Dynamic Object-Oriented Requirements System
- DODR: Design Output Design Review
- FCAT: Feature Complete Acceptance Testing
- FTA: Fault Tree Analysis
- FTE: Full Time Employee
- ID: Identifier
- IVD: In Vitro Diagnostic
- LIS: Laboratory Information System
- SMART: Software Manual Acceptance and Regression Test
- SRS: Software Requirements Specification <Application name v. NN>
Test Plan Outline (cont'd)
Naming Convention – examples:
- Each Software Verification Design will be uniquely identified as AA-XX, where AA is the acronym for the Application under test and XX is the appropriate Application functional-area acronym (e.g., CM – Common).
- Verification Procedures will be named AA-XXYY, where XX is inherited from the corresponding Verification Design and YY is a sequential number.
- For Verification Procedures that cover complex functionality divided into parts for ease of execution, the convention AA-XXYY_N will be used, where N is 1, 2, etc.
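A quick way to keep such IDs consistent is to validate them mechanically. A sketch, assuming the AA-XXYY_N convention above with two-letter acronym fields and two-digit sequence numbers (those widths are an assumption about how the IDs are written):

```python
import re

# Matches AA-XXYY and AA-XXYY_N per the convention above; the field widths
# (two letters, two digits) are an assumption.
PROCEDURE_ID = re.compile(r"^[A-Z]{2}-[A-Z]{2}\d{2}(?:_\d+)?$")

for pid in ["AA-CM01", "AA-CM01_2", "AA-CM-1"]:
    print(pid, "valid" if PROCEDURE_ID.match(pid) else "INVALID")
```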
Test Plan Outline (cont'd)
Build qualification – example:
- Each build candidate for official execution of Software Verification Procedures during FCAT, System Verification, Customer Evaluations, and System Validation will be evaluated by running Build Acceptance Procedure(s) to verify basic functionality.
- SMART testing includes documented exploratory testing and may include additional regression cases based on defect fixes and areas affected by those defects, as determined by an impact analysis.
- Only smoke procedures will be executed if a build is planned to be used for defect verification only.
Test Design Outline (for a complex application, when the verification plan cannot include all specifics)
- Features To Be Tested
- References
- Definitions, Abbreviations
- Testing Approach
- Table of Planned Protocols/Procedures/Cases with Defined Identifiers (can be organized by functional area)
Test Design Outline (cont'd)
- Features To Be Tested: list of requirements
- References: SRS (including version); Functional/UI specification
- Definitions, Abbreviations: specific to this design and not included in the Verification Plan
- Testing Approach
- Table of Planned Protocols/Procedures/Cases with Defined Identifiers (can be organized by functional area)
Test Design Outline (testing approach example)
- Statistics results accuracy will be verified by using artificial files.
- Verification of common functionality of the application will be performed in Stand-Alone or Instrument-Connected mode. To ensure appropriate coverage, Equivalence Class Partitioning techniques are used. Equivalence classes will be defined by SQE based on known differences between the modes and on risk analysis, and approved by software developers during procedure review and sign-off.
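As a sketch of how equivalence classes can drive concrete cases, the parametrized test below runs one representative input per class in both modes. The two modes come from the slide; the statistics routine, the file stand-in, and the class boundaries are hypothetical.

```python
import pytest

# Hypothetical system under test: a statistics routine fed artificial files.
def make_artificial_file(record_count):
    return list(range(record_count))          # stand-in for a real artificial file

def run_statistics(records, mode):
    class Result:
        pass
    r = Result()
    r.record_count = len(records)
    return r

MODES = ["stand_alone", "instrument_connected"]
# One representative per equivalence class; boundaries are illustrative.
CLASSES = [("empty_file", 0), ("typical_file", 1_000), ("max_size_file", 1_000_000)]

@pytest.mark.parametrize("mode", MODES)
@pytest.mark.parametrize("label,record_count", CLASSES)
def test_statistics_per_equivalence_class(mode, label, record_count):
    result = run_statistics(make_artificial_file(record_count), mode=mode)
    assert result.record_count == record_count
```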
Test Design Outline (testing approach example)
- Requirement AA-CR-62 will be verified by creating the maximum number of reports based on the maximum number of worksheets allowed in the XYZ software, as specified in the XYZ Software Version 5.0 GW Worksheets Software Functional Specification.
- Headers and footers will be verified by using 60 characters in each section, with the total number of characters in the header and footer not exceeding 150.
Test Case Outline (?)
Test Procedure Outline
- Test Execution Information (computer configuration, browser, etc.)
- Purpose
- References
- Preconditions / Special Requirements
- Acceptance Criteria for the Verification Results
Some companies also include Expected Result / Output Values (what you are supposed to get from the application) and Actual Result (what you really get from the application).
Example of the Test Document template
Example of the case in Procedure
Example of the case in Verification Procedure (cont'd)
Test Procedure for complex functionality
End-to-End Procedure
An end-to-end procedure validates whether the flow of an application from start point to end point happens as expected. For example, when testing a web mail page, the start point is logging in and the end point is logging out. The end-to-end scenario is: log in to the application, get into the inbox, open and close a mail, compose a mail, reply to or forward a mail, check the Sent items, and log out.
Note: System Testing is the methodology for validating whether the system as a whole performs per the requirements. An application contains many modules or units, and in system testing we validate that the application as a whole performs per the requirements.
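A sketch of that web-mail scenario as an automated end-to-end check, using Selenium WebDriver. The URL and every element locator are hypothetical placeholders, since they depend entirely on the application under test.

```python
# End-to-end sketch of the log-in -> mail -> log-out flow with Selenium.
# The URL and all element IDs are hypothetical placeholders.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
try:
    driver.get("https://mail.example.com")                     # start point: log in
    driver.find_element(By.ID, "username").send_keys("tester")
    driver.find_element(By.ID, "password").send_keys("secret")
    driver.find_element(By.ID, "login").click()

    driver.find_element(By.ID, "inbox").click()                # get into the inbox
    driver.find_element(By.CSS_SELECTOR, ".mail-row").click()  # open a mail
    driver.find_element(By.ID, "compose").click()              # compose a mail
    driver.find_element(By.ID, "send").click()
    driver.find_element(By.ID, "sent-items").click()           # check the Sent items

    driver.find_element(By.ID, "logout").click()               # end point: log out
    assert "Log In" in driver.page_source
finally:
    driver.quit()
```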
Build Qualification
- Conducted per Verification Plan
- Execution of defined procedures/scripts
- Regression testing: SMART
- Testing around defect fixes using release notes (release notes are generated for each build)
Procedure Execution
- Dry run during feature implementation
- Execution during final testing
Testing vs. Test Log
- Functional
- Exploratory
- JIT (Just in Time)
- Integration Testing
Test Log Format (?)
Test Log Format
Test Log Format (example)
Summary Report
- Purpose
- Scope
- References
- Definitions, Abbreviations, and Acronyms
- Summary
- Variances
- Summary of Activities
- Summary of Results
- Evaluation
- Change History
- Addendum
Summary Report (cont'd)
Summary:
- Builds evaluation
- List of builds
- Final build
Summary of Activities:
- Instruments, browsers, OS, etc. used
- % of each component (e.g., what was executed on which computer type or browser)
- % of SQE team involvement
Summary Report: Summary of Results
- All defects submitted by SQE and planned for this release are either fixed or marked as non-reproducible, and were verified by SQE; all verified defects are closed.
- All defects found during Feature Complete Acceptance Testing are submitted into the defect-tracking database and included in the Verification Log (Addendum A to this document).
- The following abbreviations are used in the 'Result' column of the Verification Log (Addendum A):
  - Verified and closed: 'C'
  - Assigned, high priority, with planned version other than 1.0.2: 'H-FR'
  - Work completed, with planned version other than 1.0.2: 'WC-FR'
  - Assigned, medium or low priority: 'A-M' or 'A-L'
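Those Result codes make the log easy to summarize mechanically. A sketch, assuming the log can be read as a simple list of (defect ID, result code) pairs; the entries shown are invented:

```python
from collections import Counter

# Hypothetical Verification Log entries: (defect_id, result_code).
log = [("D-101", "C"), ("D-102", "C"), ("D-103", "H-FR"),
       ("D-104", "A-M"), ("D-105", "WC-FR")]

counts = Counter(code for _, code in log)
print(counts)                                  # tally per Result code
open_items = [d for d, code in log if code != "C"]
print("Not closed in this release:", open_items)
```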
Evaluation
The build X.Y.Z.KLMN (firmware version XYZ) was evaluated by execution of all defined build-qualification procedures and additional exploratory testing, and was designated Final Candidate #1 for the release. All defects submitted by SQE and planned for this release are either fixed or marked as non-reproducible, and were verified by SQE.
Bug Activities
Not a Myth
- Does not happen on the developer's machine
- Disappears when you show it to others!
Example (?)
When "Percent of Completion (%)" is specified, the software may require the user to specify "Percent of the Individual Contribution" as well.
Bug Report (?)
Example: Copy (?)
"Select region by holding the 'Shift' or 'Ctrl' key while clicking on another region. Choose 'Copy' from the contextual menu or 'Copy' in the Edit menu of the application Menu Bar, or press 'Ctrl'+C."
Bug Report (?)
Example: Copy
"Select region or select multiple regions by holding the 'Shift' or 'Ctrl' key while clicking on another region. Choose 'Copy' from the contextual menu or 'Copy' in the Edit menu of the application Menu Bar, or press Ctrl+C."
Example - Log In (?)
A Log In screen is displayed at the launch of the application. Users will be prompted to choose a username from the User Name list and type the password. The password is masked so that it cannot be read on the screen. The User Name list is ordered.
Bug Report (?)
Example - Log In
A Log In screen is displayed at the launch of the application. Users will be prompted to choose a username from the User Name list and type the password. The password is masked so that it cannot be read on the screen. The User Name list is ordered by User Account creation.
Bug Report (?)
Example - Log In
A Log In screen is displayed at the launch of the application. Users will be prompted to choose a username from the User Name list and type the password. The password is masked so that it cannot be read on the screen. The User Name list is ordered alphabetically.
Surprise…
Listening to Your Defects
Defects can tell us a lot about:
- Our projects
- Our product
- Our process
What Defects Say about Your Project
- Key exit criteria for testing (and release criteria)
- Defect matrix: Are we done finding defects? Have the important bugs been resolved?
- Timely entry of defect data is essential to generate accurate reports and make thoughtful decisions.
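One way such exit criteria become actionable is as a scripted release gate. A sketch, assuming each defect record carries a severity and a status; the fields and threshold values are illustrative, not a prescribed policy.

```python
# Illustrative release gate over a defect list; fields and thresholds
# are assumptions, not a prescribed policy.
defects = [
    {"id": "D-201", "severity": "high", "status": "closed"},
    {"id": "D-202", "severity": "medium", "status": "open"},
    {"id": "D-203", "severity": "low", "status": "open"},
]

open_high = [d["id"] for d in defects
             if d["severity"] == "high" and d["status"] != "closed"]
open_total = sum(d["status"] != "closed" for d in defects)

exit_ok = not open_high and open_total <= 5    # example criteria
print("Exit criteria met" if exit_ok
      else f"Blocked: open high-severity {open_high}, open total {open_total}")
```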
What Defects Say about Your Product
Defects can tell a lot about your product:
- What is the level of quality risk?
- Where are the defect clusters?
- Which areas are the most fragile?
- Timely entry of defect data is essential to generate accurate reports, make thoughtful decisions, and drive product-improvement decisions.
Quality Risk
What Defects Say about Your Process
- Software engineering is a human process.
- Did developers create robust unit tests?
- Where are the defect clusters? Failed unit tests?
- Why did the build fail so many times?
- Potential for process improvement.
Types of Testing
- Black-box testing: based on product documentation (also called behavioral or functional).
- Structural testing: based on the object's structure, the code (also called glass-box, white-box, or unit testing).
- Configuration testing: hardware environments, installation options.
- Compatibility testing: forward & backward compatibility, standards & guidelines.
- Usability testing.
- Document testing: user manual, advertising, error messages, help, installation instructions.
- Localization (L10n) testing: locale testing, etc.
Models & Processes
- Structured (Waterfall)
- Iterative/Incremental
- Spiral
- Agile & Scrum
- RAD
- XP
Waterfall
V-Model
The V-Model can be seen as an extension of the Waterfall Model. It demonstrates the relationships between each phase of the development life cycle and its associated phase of testing.
V-Model (cont'd)
Science Application Validation Components:
- Hardware
- Software
- Reagent
- System (optics, fluidics, etc.)
V-Model (Instrument-Based Application)
Software Development Process for Instrument-Based Software
- System Verification (verification of the integration of instrument, reagents, and software application; risk-analysis mitigation; stress and hazard; characterization) – "Have we built the software right?"
- System Validation (validation of user-defined scenarios, creation of custom templates, etc.) – "Have we built the right software?"
"Incremental" and "Iterative"
- Various parts of the system are developed at different times or rates and integrated as they are completed.
- The basic idea is to develop a software system incrementally, allowing the developer to take advantage of what was learned during the development of earlier, incremental, deliverable versions of the system.
"Incremental" and "Iterative" (cont'd)
- Key steps in the process are to start with a simple implementation of a subset of the software requirements and to iteratively enhance the evolving sequence of versions until the full system is implemented.
- At each iteration, design modifications are made and new functional capabilities are added.
"Incremental" and "Iterative" (cont'd)
Spiral Model
The Spiral Model combines elements of both design and prototyping-in-stages. It is intended for large, expensive, and complicated projects.
Spiral Model (cont'd)
Problems with the Previous Methodologies
1) Releasing applications takes a long time, which may result in an inadequate, outdated, or even unusable system.
2) The assumption that the "Requirement Analysis" phase will identify all the critical requirements.
Agile Model
- Iterations in short amounts of time (1-4 weeks).
- Each iteration passes through a full software development cycle: planning, requirement analysis, design, coding, testing, and documentation.
List of Agile Methods
- Scrum
- Test-Driven Development
- Feature-Driven Development
Scrum
Scrum (cont'd)
- The set of features that go into each sprint comes from the product backlog.
- During the sprint, a daily project status meeting occurs (called a scrum or "the daily standup").
- No one is able to change the sprint backlog, which means that the requirements are frozen for the sprint.
User Stories
Acceptance Criteria
User Stories become Tasks
User story: "As a research user, I want to duplicate an existing tube without data, to acquire another tube just like it."
Tasks for designing tube duplication:
- Produce a written design
- Get it peer reviewed
- Write unit tests against a stubbed-out implementation (see the sketch below)
- Implement the design
- Write automated acceptance tests
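A sketch of the "unit tests against a stubbed-out implementation" step for that story. The `Tube` type and `duplicate_tube` function are hypothetical names invented for illustration; only the story itself comes from the slide.

```python
# Test-first sketch for the tube-duplication story. `Tube` and
# `duplicate_tube` are hypothetical; the stub exists only so the
# test can be written (and fail) before the real implementation.
from dataclasses import dataclass, field

@dataclass
class Tube:
    name: str
    settings: dict
    data: list = field(default_factory=list)

def duplicate_tube(tube: Tube) -> Tube:
    raise NotImplementedError  # stubbed-out implementation

def test_duplicate_copies_settings_but_not_data():
    original = Tube("Tube-1", {"gain": 2.5}, data=[0.1, 0.2])
    copy = duplicate_tube(original)
    assert copy.settings == original.settings   # same acquisition setup
    assert copy.data == []                      # "without data"
    assert copy is not original
```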
Builds
- Continuous (subset of unit tests)
- Nightly (includes some UI)
- Comprehensive (up to 12 hours – intensive UI)
- Penalty box (broken build – notification sent to the person responsible for the test)
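One common way to implement such build tiers is with pytest markers, selecting a subset per build type. The marker names below mirror the tiers above and are an assumption about how a suite could be organized (custom markers should be registered in pytest.ini to avoid warnings).

```python
# Tier-selection sketch using pytest markers; run, for example:
#   pytest -m continuous               (fast subset for every commit)
#   pytest -m "continuous or nightly"  (nightly build)
import pytest

@pytest.mark.continuous
def test_core_math():
    assert 2 + 2 == 4

@pytest.mark.nightly
def test_some_ui_path():
    ...  # slower UI-level check

@pytest.mark.comprehensive
def test_long_running_ui_suite():
    ...  # part of the up-to-12-hour intensive UI run
```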
Test-Driven Development
This technique consists of short iterations where new test cases are written first. The availability of tests before actual development ensures rapid feedback.
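Continuing the tube-duplication sketch above: in TDD the failing test drives the implementation, so the stub is then filled in just enough to make the test pass. This snippet reuses the hypothetical `Tube` dataclass from the earlier sketch; the "(copy)" naming rule is likewise an invented example.

```python
import copy

# Step 2 of the red-green cycle: replace the stub with the minimal
# implementation that makes the earlier failing test pass.
# Assumes the Tube dataclass from the previous sketch.
def duplicate_tube(tube: Tube) -> Tube:
    return Tube(
        name=tube.name + " (copy)",
        settings=copy.deepcopy(tube.settings),  # same acquisition setup
        data=[],                                # deliberately empty: "without data"
    )
```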
Feature-Driven Development
- High-level walkthrough of the scope
- Feature list
- Plan by feature
- Design by feature
- Build by feature
- Milestone
Rapid Application Development (RAD)
- Involves iterative development and the construction of prototypes.
- Prototypes (mock-ups) allow users to visualize an application that hasn't yet been constructed.
- Prototypes help stakeholders make design decisions without waiting for the system to be built.
Extreme Programming (XP)
A form of Agile software development:
- Ongoing changes to the requirements.
- Focus on designing and coding for current needs instead of future needs: not spending resources on something that might not be needed (uncertain future requirements).
XP (cont'd)
- The system is developed in increments, so it can quickly be modified in response to end-user and customer feedback.
- Mostly evolutionary prototyping; efforts can start with any high-risk area.
The Role of Testing in Agile
- Testing is the headlights of the project: Where are you now? Where are you headed?
- Testing provides information to the team, allowing the team to make informed decisions.
- A "bug" is anything that could bug a user; testers don't make the final call.
- Testing does not assure quality; the team does (or doesn't).
- Testing is not a game of "gotcha": find ways to set goals rather than focus on mistakes.
Acceptance Testing in Agile
- User stories are short descriptions of features that need to be coded.
- Acceptance tests verify the completion of user stories.
- Ideally they are written before coding.
A Way of Thinking about Acceptance Tests
Turn user stories into tests. Tests provide:
- Goals and guidance
- Instant feedback
- Progress measurement
Tests are specified in a format:
- Clear enough that users/customers can understand it
- Specific enough that it can be executed
Specification by Example
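A sketch of "specification by example" for the tube-duplication story: concrete examples a customer can read directly become executable rows of a parametrized test. The table values and the "(copy)" naming criterion are hypothetical, and the snippet reuses the `Tube` and `duplicate_tube` names from the earlier sketches.

```python
import pytest

# Each row is a customer-readable example: (tube name, settings, expected copy name).
# Assumes Tube and duplicate_tube from the previous sketches.
EXAMPLES = [
    ("CD4 panel", {"gain": 2.5}, "CD4 panel (copy)"),
    ("Isotype",   {"gain": 1.0}, "Isotype (copy)"),
]

@pytest.mark.parametrize("name,settings,expected_name", EXAMPLES)
def test_duplication_examples(name, settings, expected_name):
    original = Tube(name, settings, data=[7.0])
    copy = duplicate_tube(original)
    assert copy.name == expected_name
    assert copy.settings == settings and copy.data == []
```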
Agile: Exploratory Learning
- Plan to explore the product with each iteration.
- Look for bugs, missing features, and opportunities for improvement.
- We don't understand software until we have used it.
"Value individuals and interactions over processes and tools" – in practice
- Automated testing confirms that the code works as expected (unit tests in the hands of the developers, to satisfy the developers' acceptance criteria).
- Manual testing runs confirmatory tests with close observation for undesirable behaviors (focused acceptance testing driven by the customers/stories, to satisfy the customers' acceptance criteria).
- Note: the product will be judged by the customer, typically through manual testing.
- Testing by testers is often driven by the need to measure the system's performance and to find surprises. Tools are very much in evidence, but rigid test scripts and procedures do not give the requisite opportunity for discovery, diagnosis, and exploitation.
Regulatory: Traditional vs. Agile
- In the ideal world, you would verify a 'finished' product against a finished specification.
- In Agile, you validate a moving target against a changing backlog.
Finished Device – Agile vs. FDA
- Software validation is a part of design validation of the finished device (Guidance for the Content of Premarket Submissions for Software Contained in Medical Devices).
- Payoff for the effort: validation becomes integral to development.
- Payoff for validation: refactoring.
- Payoff for Scrum: fewer defects in the final product.
Traditional Test Approach
Common Agile Testing Approach
- Traditional testing is expensive and often makes validation hard to do. On the other hand, the common Agile testing approach makes documenting verification activities difficult.
- Within an iteration: set of test cases; checklist; SMART; draft of the procedure; updates of previously created cases/procedures; VP sign-off for the completed feature.
Regulated Agile Testing Approach
- "Test plans": sets of test cases, checklists, and SMART for work within an iteration (and in sequential iterations). Include cases for currently interesting areas and across the system, to provide feedback on ongoing stability.
- SCM diff report, reviewed/signed by developers.
SQE Workflow: From Sprints to FCAT
Feature-backlog items become new functionality in Sprints N, N+1, and N+2, feeding into FCAT; in each sprint, SQE also works off technical debt against existing functionality.
FS Backlog: Preparation for FCAT
An epic's stories (Story 1, Story 2) are developed across Sprints N and N+1, each with unit tests (Unit Test 1 … Unit Test N+M); SQE derives test cases from them (Test Case 1.1, 1.2, 1.3, 2.1, 2.2), which feed Verification Procedure drafts and automation scripts (VPs).
Integration Testing of Sprint N-1 during Sprint N
Three team/SQE pairs each run New Feature Development in Sprints N, N+1, and N+2, with SQE technical-debt work in every sprint; an Integration Scrum follows, with two 3-day testing windows.
Stabilization Sprint
FCAT Option 1: One FCAT
Pros: code is completed.
Multiple FCATs: Same Size
Multiple FCATs: Different Sizes
Thank You! Questions?