Soft Testing at Mayo Clinic
We have been working on Soft installations, and testing Soft, for over 10 years.
Mark Welter, Software Testing Manager, Department of Information Technology
Mayo Clinic
4,729 physicians and scientists and 58,405 allied health staff
Campuses in Rochester, MN; Jacksonville, FL; and Phoenix/Scottsdale, AZ
Serves more than 60 communities through the Mayo Clinic Health System
Extends knowledge and expertise to physicians and providers through the Mayo Clinic Care Network
Collectively, cares for more than 1.3 million patients each year
Department of Information Technology
Staff of 1,700+
Serves all Mayo Clinic entities at all locations
57,000+ workstations
7,800+ servers
Division of Laboratory Pathology and Extramural Applications (LPEA) – Software Quality Assurance
SQA staff of 51:
3 Software Quality Analysts
14 Software Testers
7 Laboratory System Interface Testers
5 Test Automation Developers
22 Managed Service Testers
Supports all Department of Laboratory Medicine and Pathology applications (215+)
Also leads and supports testing in other departments through the Mayo IT Testing Center of Excellence
Mayo / SCC Development Partnership
Current model:
Implement multiple releases a year
Have averaged over 100 deliveries of code into our Soft environments per year for the past 4 years
SCR / customization intensive
Work with Soft as they have developed new modules
Manage 10 Soft environments at Mayo, and now have one environment at Soft
New model:
Transition to a maintenance client
Two planned GA releases a year
Limited SCRs / customizations
We describe how we test Soft not because this is the only way, or the best way, but to see if there are any additional ideas for how you can do your Soft testing. (Speaker note: mention hot fixes.)
Verification vs. Validation
At Mayo we break testing down into two basic concepts:
Verification: Did we build the system correctly? (per defined specifications)
Validation: Did we build the correct system? (per user expectations)
Terminology: Verification / Validation
Verification: Are we building the product right? Does the software meet specifications?
Validation: Are we building the right product? Can the staff use the system as built?
Types of Testing
System Verification:
Functional
Interface
Performance
Security
Coordinated Activities:
Infrastructure Verification (OS & client I/O, back-up & recovery)
Instrument Verification
Parameter Set-up Verification
File Build Verification
Full Path Verification
External Client Testing
System End-to-End Testing
Validation:
Regulatory Compliance Validation (patient reports, RD, billing)
Process Validation / User Acceptance Testing (UAT)
A Multi-faceted Approach to Evaluating Software Health
Exercise the functions (business rules & triggers)
Simulate interfaces [data transfers] (modules, systems, instruments)
Simulate traffic loads
Review deliverables (patient reports, management reports, labels)
Observe user / system interactions
Compare database table settings (see the sketch below):
File build setup between SCC environments (dev/test/prod)
File build setup between systems (e.g., lab to EMR)
Build verification scripts
Soft test setup
Workstation setup
V&V activities, combined with training & competency testing, will together assist us in achieving system confidence.
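To make "compare database table settings" concrete, here is a minimal sketch of a build-verification-style check. It assumes each environment's settings have been exported as key,value CSV files; the export format and file names are hypothetical, not SCC-defined.

```python
# Sketch: diff file-build/table settings between two environments
# (e.g., test vs. prod) so unexpected configuration drift is visible.
import csv

def load_settings(path):
    """Read a key,value CSV export (hypothetical format) into a dict."""
    with open(path, newline="") as f:
        return {row[0]: row[1] for row in csv.reader(f) if len(row) >= 2}

def diff_settings(env_a, env_b):
    """Return settings missing from either side or with mismatched values."""
    mismatches = {}
    for key in sorted(set(env_a) | set(env_b)):
        a, b = env_a.get(key, "<missing>"), env_b.get(key, "<missing>")
        if a != b:
            mismatches[key] = (a, b)
    return mismatches

test_env = load_settings("softlab_test_settings.csv")  # hypothetical export
prod_env = load_settings("softlab_prod_settings.csv")  # hypothetical export
for key, (a, b) in diff_settings(test_env, prod_env).items():
    print(f"{key}: test={a} prod={b}")
```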
Risk Based Testing
It's naïve to think that you can test everything:
There is not just one path for generating results (different permutations, instruments, sites, EMRs, etc.)
There is more than one way that a test can be resulted
Each of these variations evokes different flows and user processes
Development / set-up is done simultaneously (code, test definition, instruments, interfaces, process)
At Mayo, no application is an island; there are coordinated changes that must occur across applications
Both system and human behavior are altered with volume
Change control: knowledge of changes and impact analysis is crucial
One way to rank where limited test effort goes first is sketched below.
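As a minimal illustration of risk-based prioritization, the sketch below ranks test areas by likelihood × impact. The 1-5 scales and the example areas are illustrative assumptions, not Mayo's actual model.

```python
# Sketch: rank test areas by risk so limited test time goes to the
# highest-exposure permutations first.
test_areas = [
    # (area, likelihood of failure 1-5, impact of failure 1-5)
    ("Core analytic result entry",       3, 5),
    ("Instrument interface - chemistry", 4, 5),
    ("Low-volume esoteric workflow",     2, 2),
    ("Patient report formatting",        3, 4),
    ("Billing interface",                2, 5),
]

# Risk = likelihood x impact; test from the top of the list and pick a
# cut-off below which coverage is deferred or accepted as residual risk.
for area, likelihood, impact in sorted(
        test_areas, key=lambda t: t[1] * t[2], reverse=True):
    print(f"risk={likelihood * impact:2d}  {area}")
```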
User Acceptance Testing (UAT)
Why do User Acceptance Testing?
UAT captures whether the product/system meets the end user's needs prior to putting new code into production
End users find different defects than testers do
History has proven that failure to perform UAT frequently results in significant, and often urgent, issues in production
Who performs UAT? End users:
Information Management Techs in each lab
Super Users from each lab
User Acceptance Testing (UAT) – When?
When is UAT performed?
Before go-live
In the test environment, on the new code version
Coordinated with software and interface verification
Coordinated with new test and instrument verification
UAT needs to be as "real" as possible to catch as many issues as possible prior to go-live
One to many rounds of UAT are performed, depending on the size of the release
Tools:
All labs document their UAT in a Test Management Tool
Allows for standardization, monitoring of progress, trending, logging of issues, and status reporting
User Acceptance Testing (UAT) - Workflows
End users exercise their pre-analytical, analytical, and post-analytical workflows for UAT
Includes connected instruments
Focus on the core analytic processes first to ensure that those workflows still work as expected with the new code version
Choose permutations to exercise for your high-volume and/or complex analytic workflows
Testing Process
Prepare Test Plan → Test Case Development → Perform Software Testing → Create Test Summary Report → Release Readiness Review
Prepare Test Plan
A Test Plan document describes the scope, approach, resources, and schedule of intended testing activities. It covers:
Test Scope
Test Approach
Project Risks & Assumptions
Test Environment
Test Controls
Planned Test Suites / Test Cases
Development / Test Procedures
Test Plan Approvals
One of the most important testing documents; every plan is formally reviewed by a lead or manager
Test Case Development
A Test Case is a set of execution preconditions (setup), inputs, and expected outcomes developed for a particular objective, which traces back to the requirement(s)
It contains the list of actions and expected results needed to verify that a function works as designed (one possible record structure is sketched below)
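A minimal sketch of how such a test case record might be structured so that preconditions, steps, expected results, and requirement traceability travel together. The field names and example values are illustrative; a real test management tool defines its own schema.

```python
# Sketch: a self-contained test case record with requirement traceability.
from dataclasses import dataclass

@dataclass
class TestCase:
    case_id: str
    objective: str
    requirement_ids: list[str]    # traceability back to the requirement(s)
    preconditions: list[str]      # setup needed before execution
    steps: list[tuple[str, str]]  # (action, expected result) pairs

tc = TestCase(
    case_id="TC-0421",
    objective="Verify a corrected result crosses to the EMR",
    requirement_ids=["REQ-118"],
    preconditions=["Patient registered", "Original result verified"],
    steps=[
        ("Issue a corrected result", "Status changes to corrected"),
        ("Open the EMR result viewer", "Corrected flag and both values shown"),
    ],
)
print(f"{tc.case_id} traces to {', '.join(tc.requirement_ids)}")
```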
Perform Software Testing
The records kept during test execution are vital for supporting system V&V
Test records must be sufficient to allow for test reconstruction and must identify:
Tester
Test date
Test environment
Test steps
Test data exercised
Acceptance criteria
Test results (pass/fail)
Defect/bug disposition
We use testing tools like TFS to capture our run records and the associated metadata (a completeness check over such records is sketched below)
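A minimal sketch of auditing exported run records for reconstruction-sufficiency. The required field list follows the slide; the JSON export format and field names are assumptions, not TFS's actual schema.

```python
# Sketch: flag run records missing the metadata needed to reconstruct a test.
import json

REQUIRED_FIELDS = [
    "tester", "test_date", "environment", "steps",
    "test_data", "acceptance_criteria", "result", "defect_disposition",
]

def incomplete_records(path):
    """Yield (record id, missing fields) for records that can't be reconstructed."""
    with open(path) as f:               # assumed: a JSON list of record objects
        for record in json.load(f):
            missing = [k for k in REQUIRED_FIELDS if not record.get(k)]
            if missing:
                yield record.get("id", "<no id>"), missing

for rec_id, missing in incomplete_records("run_records_export.json"):
    print(f"{rec_id}: missing {', '.join(missing)}")
```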
Create Test Summary Report
The Test Summary Report summarizes the V&V activity and is the basis for the submission for approval to implement
The report includes:
Summary statement of the testing outcomes
Any testing variances from the plan
Traceability Matrix
All open release defects (including deficiencies that have been accepted "as is" or with a workaround)
Test Metrics (a sketch of deriving two of these inputs follows below)
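A minimal sketch of deriving two of the report's inputs, a requirements coverage figure from the traceability matrix and a pass/fail tally. The data shapes are illustrative assumptions, not an actual tool export.

```python
# Sketch: compute requirement coverage and pass/fail counts for the report.
requirements = {"REQ-1", "REQ-2", "REQ-3", "REQ-4"}
traceability = {            # test case -> requirements it exercises
    "TC-01": {"REQ-1"},
    "TC-02": {"REQ-1", "REQ-2"},
    "TC-03": {"REQ-4"},
}
results = {"TC-01": "pass", "TC-02": "pass", "TC-03": "fail"}

covered = set().union(*traceability.values())
uncovered = requirements - covered
passed = sum(1 for r in results.values() if r == "pass")

print(f"requirement coverage: {len(covered)}/{len(requirements)}"
      f" (uncovered: {sorted(uncovered) or 'none'})")
print(f"test cases passed: {passed}/{len(results)}")
```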
Release Readiness Review
Flow: the Test Plan and Test Summary are kept in the record center (TFS) → review & approve → high-risk changes go to the Change Approval Board (CAB) → the Release Readiness Review / Release Readiness Board confirms content & obtains stakeholder approvals → implement
Participants: Release Manager, Development Representative, Test Representative, Deployment Manager, Business Representative, Support Representative
Tool support: TFS
Structure: Release Managers and their applications are defined; the Release Managers collectively form the Release Readiness Board (RRB)
Takeaways: How might this apply to you?
Evaluate minimum coverage based on risk; risk-based testing is the real answer
There are not enough resources or time to test "everything"
Areas to consider:
File build verification (mandatory)
Regulatory Compliance Validation (patient reports, RD, billing) (mandatory)
End-to-end testing (strongly recommended)
User Acceptance Testing (strongly recommended)
Questions?
Testing Activities Performed by SCC Soft Computer
Kenton Smith, Director of Quality Management, SCC Soft Computer
FDA Establishment Registration
FDA Website Listing (FURLS) of SCC Medical Devices
As an FDA-registered medical device manufacturer, SCC must comply with 21 CFR 820
FDA – Verification vs. Validation
From 21 CFR 820.30(f) & (g):
Design verification. "Each manufacturer shall establish and maintain procedures for verifying the device design. Design verification shall confirm that the design output meets the design input requirements…"
Design validation. "…Design validation shall ensure that devices conform to defined user needs and intended uses and shall include testing of production units under actual or simulated use conditions…"
V&V at SCC
Validation:
Performed independently of product development
Meets end user requirements
Focused on functionality
Verification:
Primarily performed by product development
Meets functional requirements
Focused on code
Both are included in the test plan; both are required for release of software
Where is Validation Performed?
By SCC's QC Team (current model):
Team comprised of people with both lab and I.T. backgrounds
Test cases derived from verification test cases (redundant?)
Little involvement early in development
Compliant from a regulatory perspective, but as effective as it could be?
Reset of SCC QC Team Goals:
Transition from QC Team to Customer Validation Team:
Advocates for customers
Involved early in development
Risk-based approach
More robust regression testing
Create unique test cases based upon best-practice workflow
Reset of SCC QC Team Goals: Learn from Post-Live Feedback
Gather data from post-live tasks
Analyze the data to see if we had an opportunity for prevention
Flow: Go-Live → Post Go-Live Data → Data Analysis → Change in Testing
One way such an analysis could look is sketched below.
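A minimal sketch of that analysis step, tallying post-live issues by the test phase that might plausibly have caught them. The "catchable_by" tags and the CSV layout are illustrative assumptions.

```python
# Sketch: scan post-go-live issue data for prevention opportunities.
import csv
from collections import Counter

def prevention_opportunities(path):
    """Count post-live issues by the test phase that might have caught them."""
    counts = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            # 'catchable_by' is e.g. "regression", "UAT", "performance",
            # or "none" when no realistic test would have found the issue.
            counts[row["catchable_by"]] += 1
    return counts

for phase, n in prevention_opportunities("post_live_issues.csv").most_common():
    print(f"{phase}: {n} issue(s)")
```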
Reset of SCC QC Team Goals: More Closely Mimic End-User Hardware
Meet or exceed hardware specifications
Test using thick and thin clients
Allows for better performance testing
Challenges
What is best-practice workflow?
Different configurations
Thousands of hosparams
Numerous different workflows
Millions of lines of code
Conclusion: IT'S NOT POSSIBLE TO TEST EVERY FACET OF SOFTWARE
Does the FDA have any ideas?
From General Principles of Software Validation; Final Guidance for Industry and FDA Staff (2002): “…the complexity of most software prevents it from being exhaustively tested. Software testing is a necessary activity. However, in most cases software testing by itself is not sufficient to establish confidence that the software is fit for its intended use…”
Then What Do We Do? Back to the FDA Guidance Document:
"…In order to establish that confidence, software developers should use a mixture of methods and techniques to prevent software errors and to detect software errors that do occur…"
Dream!
Mission: to develop a culture of defect prevention throughout the whole software development life cycle
Vision: SCC, ISD, and STS will avoid defects in SCC software to deliver the best product possible, creating user and customer loyalty
Values: forethought, client focus, analytical thinking, absence of bias, influence throughout the business
Dream! Defect Regression Avoidance Management
Phase I: Retrospective Review (complete)
Heavy data mining
Identify root causes of released defects
Phase II: Define Corrective Actions (in progress)
Execute the corrective actions to resolve the issue(s) and prevent recurrence
A root-cause tally in the spirit of Phase I is sketched below.
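A minimal sketch of the Phase I style data mining, grouping released defects by root cause in a Pareto-style view so corrective action can target the biggest contributors. The defect list and cause labels are illustrative assumptions.

```python
# Sketch: root-cause Pareto over released defects.
from collections import Counter

released_defects = [
    ("D-101", "requirement gap"),
    ("D-102", "regression not covered"),
    ("D-103", "environment mismatch"),
    ("D-104", "regression not covered"),
    ("D-105", "requirement gap"),
]

by_cause = Counter(cause for _, cause in released_defects)
total = sum(by_cause.values())
running = 0
for cause, n in by_cause.most_common():
    running += n
    # Pareto view: the cumulative share shows which few causes
    # account for most released defects.
    print(f"{cause}: {n} ({100 * running // total}% cumulative)")
```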
End User Validation Test Plan
Same hardware, configuration, and connectivity as the end user
Include performance testing
Validate to intended use
Questions?
References
U.S. Food and Drug Administration (January 11, 2002). General Principles of Software Validation; Final Guidance for Industry and FDA Staff.
21 CFR 820 (Quality System Regulation).
Kenton Smith kentons@softcomputer.com 727-789-0100 Ext. 4901