Computer System Validation What is it?


Computer System Validation: What is it? My name is Adam Woodjetts. I have been a validation consultant with Instem for over 7 years, carrying out many on-site validations of our software, including clinical instrument interfaces and our dispense application. In this session we are going to look at Computer System Validation and try to explain what it is and why it has to be done. Adam Woodjetts, Validation Consultant, Instem

Why Validate your Software? Why do you have to go through the often tedious, and almost always resource- and time-intensive, process of validation? Why Validate your Software?

Regulatory Requirement Good “X” Practice: a collection of FDA guidelines adopted as regulations, then laws. Manufacturing, Clinical, Laboratory. 21 CFR (Code of Federal Regulations), FDA: Part 58, GLP for Nonclinical Laboratory Studies; Part 11, Electronic Records and Electronic Signatures. Establishing evidence and confidence in consistent operation according to specifications. Documenting and systematically challenging and testing. Reduce risk to patients. Validation of “systems”, the processes, software, and instruments used in the production of pharmaceuticals, chemicals, and medical devices, is a regulatory requirement. <Click> Good X Practice Industry regulators refer to GxP, Good “something” Practice, where the X could be Manufacturing, Clinical, Laboratory, and many more. It started with the FDA (the US Food and Drug Administration) developing GMP (Good Manufacturing Practice) guidelines; these guidelines were then adopted as regulations and consequently law. The major regulations from the FDA are “Title 21 CFR Part 58”, which details expectations for GLP studies, and “Title 21 CFR Part 11”, which deals with electronic records and electronic signatures. There are more regulations and guidance documents within the FDA, and similar regulations are applied by other agencies: the EMA in Europe (European Medicines Agency), the MHRA in the United Kingdom (Medicines & Healthcare products Regulatory Agency), and the CFDA in China (China Food and Drug Administration). <Click> Establish Evidence Companies involved with the development, manufacture, and use of items in the pharmaceutical industry must be able to provide evidence of confidence that software and associated processes operate consistently, predictably, and according to the specifications of the manufacturer. This means understanding and documenting what you want or need the system to do, and demonstrating with objective evidence, in a methodical, structured manner, that it does it.
<Click> Reduce Risk The objective of this process is that, by ensuring reliable and quality data collection and/or manufacturing processes, the risk to patients is minimised. <Click> Validation of Software The computer software and associated processes and instruments used must be validated to meet regulatory expectations. <Click> Evidence of Confidence Validation is the provision of evidence demonstrating confidence in the software’s ability to perform as required in a consistent manner. Validation must demonstrate “suitability for purpose” of the software, instruments, and associated processes. Validation of software is required to fulfil regulatory demands. Evidence of confidence in “suitability for purpose”.

Demonstrating “Suitability for Purpose” It is a basic statement, but this is the primary objective of Computer System Validation, or CSV. We are going to try to describe and expand on this, showing what “suitability for purpose” means and how it can be demonstrated… Demonstrating “Suitability for Purpose”

User Requirements Why do you use a specific system? What functionality do you utilise? How did you make the selection? User requirements: clear and objective statements. Regulatory requirements. How do you want to use the software? Workflow diagrams. Standard Operating Procedures. As users, you have selected, purchased, and use particular pieces of software or instrumentation for a reason. <Click> Why? Why do you use a specific “system”? It may offer some functionality which you utilise; it saves you time; it provides consistency. Was this the only instrument or piece of software you looked at when making the selection decision? How did you know it was the right system for you? <Click> User Requirements The answer to these questions would normally be found in “User Requirements”. Whether a list of statements in a document or a complex matrix within a spreadsheet, what the ‘system’ needs to be capable of, or enable, should be established in order to make the decision to use it. Requirements should be objective, clearly worded statements. What does “The system should be Fast and Responsive” actually mean? One person’s expectations of a “fast” system may be different from those of their colleagues. It is better to state that “Formulations should be processed in X seconds”. User requirements should consider regulatory expectations, such as: time-stamped audit trails, which enable ‘data recreation’ and traceability; generation of complete records in paper and electronic media, potentially upon request by auditors or inspectors; and retention of records in a retrievable state for a specified period. All of these items are part of 21 CFR Part 11, Electronic Records and Signatures. <Click> How do you want to use the software? As well as knowing what the software needs to do, it should be understood how the software will be used, and the processes it must enable or integrate with.
This information can be presented as workflow diagrams, a graphic representation of the processes and procedures the system must follow, or documented as Standard Operating Procedures. <Click> Basis of CSV Documenting what you want the software to do, and how you want to do it, forms the basis of the CSV process. Without a clear set of requirements there is no way of knowing what anyone expects of the ‘system’; therefore, how can it be proven to be suitable? Basis of Computer System Validation
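The advice above on objective, clearly worded requirements can be sketched as a structured record. This Python fragment is purely illustrative: the field names, requirement IDs, and acceptance wording are hypothetical examples, not a prescribed or regulatory format.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Requirement:
    """One objective, testable user requirement (illustrative fields only)."""
    req_id: str       # unique identifier, used later for traceability
    statement: str    # clear, measurable wording (no vague "fast and responsive")
    regulatory: bool  # e.g. driven by 21 CFR Part 11 expectations
    acceptance: str   # how fulfilment will be judged during testing

# A vague wish rewritten as a measurable statement, as the slide suggests:
UR_001 = Requirement(
    req_id="UR-001",
    statement="Formulations shall be processed in under 5 seconds.",
    regulatory=False,
    acceptance="Process 10 representative formulations; each completes in < 5 s.",
)

# A requirement reflecting a 21 CFR Part 11 expectation (time-stamped audit trail):
UR_002 = Requirement(
    req_id="UR-002",
    statement="Every record change shall produce a time-stamped audit trail entry.",
    regulatory=True,
    acceptance="Modify a record; verify the audit trail shows user, time, old and new values.",
)
```

Writing each requirement with an explicit acceptance criterion makes the later mapping to test scripts mechanical rather than a matter of interpretation.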

Risk Analysis Justification for exclusion of requirements. Establish the importance or impact of requirements. Grade or score each requirement using a chosen method. Probability of requirement ‘failure’: how often is it “exercised”; how complex is the associated process. Impact of requirement ‘failure’: alternative methods; production stops; cost to resolve. Combine to get a score/grade. Having a structured collection of requirements will allow them to be prioritised through risk analysis. <Click> Importance Not everything in the list of requirements will prevent the ‘system’ from working, or your processes from being performed, and there will be some requirements that, if they are not fulfilled, would or could be catastrophic, potentially affecting patients! Risk analysis allows requirements to be ‘graded’ or ‘scored’. There are several possible methods, but a simple way is to consider the combination of the probability vs the impact of a requirement failure (the requirement not being met). <Click> Probability The probability or likelihood of a failure is linked to how often the requirement is “exercised”, or how complex the process is. The second point is easier to consider: the theory is that the more complex the process, the more likely it is to fail. However, there are at least two possible views regarding how often a requirement is exercised: if it’s the basic function of the ‘system’, surely it would not be released if it didn’t work properly? Or: it is functionality used many times, therefore the chance of something failing is increased. Whatever the opinion on this, it should be applied with a degree of consistency, giving the requirement a grade/score on the probability scale. <Click> Impact If a requirement is not met, what is the impact? There are a number of things which should be considered, including (but not limited to): Are there alternative methods of achieving the same thing? Will failure stop production? What is the cost to resolve the failure?
And patient safety considerations. This will give the requirement a grade/score on the impact scale. <Click> Combine Using the very basic Green, Amber, Red example displayed here, a requirement which has a high probability of failure but low impact is graded Amber… What happens with this grading is dependent on each individual organisation’s situation. For example, if time and resources are limited, it might be possible to justify the decision to NOT test requirements with a grading of Amber. A similar alternative method would be to use a number-based grading scale, with more levels for each axis of the table, as long as each score is described and applied consistently. Then the decision on whether to test could be based on a combined score higher than a predetermined level. <Click> Justification Risk analysis of user requirements can provide the evidence to justify the exclusion of requirements from testing, and therefore potentially reduce the cost and duration of validation. <Click> Prioritise or Focus Risk analysis can also demonstrate the areas in which testing should be focused, where it is more important to demonstrate correct functionality. Justification for exclusion of requirements. Prioritise or focus testing efforts.
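The probability-vs-impact grading described above can be expressed as a small scoring function. This is a minimal sketch of one possible scheme: the 1-to-3 scales, the multiplication, the thresholds, and the default Amber cut-off are assumptions for illustration, not a standard.

```python
def risk_grade(probability: int, impact: int) -> str:
    """Combine probability and impact scores (1 = low .. 3 = high) into a
    Red/Amber/Green grade, mirroring the simple matrix on the slide.
    Thresholds are illustrative assumptions, not a prescribed scheme."""
    if not (1 <= probability <= 3 and 1 <= impact <= 3):
        raise ValueError("scores must be in the range 1..3")
    score = probability * impact
    if score >= 6:
        return "Red"    # must be tested
    if score >= 3:
        return "Amber"  # test, or justify exclusion
    return "Green"      # candidate for justified exclusion

def must_test(probability: int, impact: int, threshold: str = "Amber") -> bool:
    """Example policy: test every requirement at or above the chosen grade."""
    order = {"Green": 0, "Amber": 1, "Red": 2}
    return order[risk_grade(probability, impact)] >= order[threshold]
```

As in the slide's example, a high-probability (3), low-impact (1) requirement lands on Amber, and an organisation short of time and resources could document a decision to raise the testing threshold to Red, justifying the exclusion of Amber items.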

Testing your Requirements Establishing that the environment conforms to manufacturer’s specs (IQ). Functional Testing or Operational Qualification: requirements mapped to manual test scripts; automated test tools. User Acceptance Testing or Performance Qualification: testing the workflow and operating procedures; carried out by your users. Having established what the ‘system’ is required to do, it is necessary to demonstrate that these requirements are met. <Click> Installation Qualification It is important to test and document that the environment on which the system is to be used has been correctly configured to conform to the manufacturer’s specifications. Has the software been installed correctly on appropriate hardware? Do instruments communicate with associated software as expected? This is another collection of documentation and testing which is included in the validation package. <Click> Functional Testing It is necessary to demonstrate that the “functionality” of software or instruments has been considered. This can be achieved by executing a suite of manual test scripts, mapped to individual requirements. The test scripts form part of the documentary evidence which can be used to demonstrate the system’s “suitability”. It is also possible to utilise automated test tools, which can execute tests autonomously once started. <Click> UAT Having proven that the software functions correctly, it is necessary to show that the software can be used the way you wish to use it. This area of validation is of greater value than the preceding stage, as it will demonstrate the desired workflow with “realistic” data, using tests executed by actual users. <Click> Package of Evidence Executing these tests will provide you with a package of evidence testing your requirements, demonstrating that the system is suitable for its intended purpose. Documentary evidence of requirement fulfilment.
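The mapping of test scripts to individual requirements can be illustrated with a toy automated check. Here `process_formulation` is a hypothetical stand-in for the system under test, and `UR-001` is an assumed requirement ID; the point of the sketch is that each executed test produces an evidence record tied back to a requirement.

```python
import time

def process_formulation(data: dict) -> dict:
    """Hypothetical stand-in for the real system call being qualified."""
    return {"status": "processed", **data}

def test_ur_001_processing_time() -> dict:
    """OQ-style automated test mapped to requirement UR-001 (processing time)."""
    start = time.perf_counter()
    result = process_formulation({"batch": "B-42"})
    elapsed = time.perf_counter() - start
    passed = result["status"] == "processed" and elapsed < 5.0
    # The returned record becomes part of the documentary evidence package.
    return {"requirement": "UR-001", "passed": passed, "elapsed_s": round(elapsed, 3)}
```

A manual test script would record the same three things on paper: which requirement it exercises, what was observed, and whether the acceptance criterion was met.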

Record Faults and Incidents Record any testing incidents: symptoms, cause, consequences (impact), resolution. Deviations from “the Plan”: how, why, consequences (impact). Environment changes. During validation not everything will work, and things don’t always go according to plan. <Click> Test Incidents If incidents are encountered during testing, the symptoms should be recorded to enable investigation. Even if it is not known at the time the incident is encountered, the cause should be established. The immediate consequence of the incident, as well as any impacts there may be on the future use of the system, should be recorded, not just when the incident occurs, but as ongoing investigations start to understand what has happened. A resolution may not be a “fix” to the problem, but a method of continuing the test by working around the problem, or actions which need to be taken in the future before using the system in a live environment. <Click> Deviations from “the Plan” The “Validation Plan” is something yet to be discussed, but if you deviate from a planned set of actions, it should be recorded: how you have deviated, why, and what the consequences are. For example, you might discover that an instrument has started malfunctioning and must be replaced. This might need a new test to be written, which has to be run at another time… It’s not a bad thing, and the user requirements (once reviewed) will still be tested, but not as planned, due to a change in environment and the test scripts to be executed. <Click> Changes to the Environment In a similar fashion to incidents and deviations, changes to the environment you are validating should be recorded. Using the previous example, instrument 1 was intended to be validated, but replacing it is a change in the validation environment, especially if there are any installation and configuration activities related to either instrument.
<Click> Record of what has happened Validation can be considered a record of how requirements are proven, and what happened when it was done. Incidents don’t have to be “fixed”, but the resolutions, the actions taken to mitigate the fault or work around the situation, must be recorded. <Click> Usable in the Distant Future The first time the validation package is reviewed may be some time after the validation activities finished. Even if the same staff are available, they may not remember the exact scenario leading to an incident or change. When recording your incidents, consider: can you repeat, understand, or recreate the incident with the information provided? Validation is a record of what has happened. Can the issue be ‘recreated’ with the information recorded?
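The incident fields listed above (symptoms, cause, impact, resolution) can be sketched as a record type with a simple completeness check, reflecting the question "can the issue be recreated with the information recorded?". The structure, field names, and completeness rule are illustrative assumptions, not a mandated format.

```python
from dataclasses import dataclass

@dataclass
class Incident:
    """One test incident; cause, impact, and resolution may be filled in as
    the investigation progresses, not only at the moment of occurrence."""
    incident_id: str
    symptoms: str                       # recorded when the incident occurs
    cause: str = "under investigation"  # established later if unknown at the time
    impact: str = ""
    resolution: str = ""                # a fix, a workaround, or a pre-go-live action

    def is_recreatable(self) -> bool:
        """Crude completeness check: could a reviewer, years later, recreate this?"""
        return bool(self.symptoms) and self.cause != "under investigation"

# The instrument-replacement example from the slide, recorded over time:
inc = Incident("INC-001", symptoms="Instrument 1 stopped responding during test OQ-07")
inc.cause = "Hardware fault; instrument replaced"
inc.impact = "OQ-07 deviates from the Validation Plan; validation environment changed"
inc.resolution = "Re-run OQ-07 on the replacement instrument; record the environment change"
```

A real system would also capture who recorded each field and when, since the record itself is subject to the same traceability expectations as the data it describes.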

Packaging the Documentary Evidence Create a structured Validation Plan: environment and installation details (IQ); what is being tested and why; how requirements are proven (demonstrated); automated and manual tests; leveraging vendor test evidence. Summary Report: what are your findings/conclusions; can the software be used with confidence; deviations from the Validation Plan; what evidence supports your findings. It is not sufficient to have a collection of requirements, just run some tests, and make the statement “it is suitable” and therefore validated. <Click> Validation Plan The test evidence must be viewed as part of a structured package of information which starts with a validation plan detailing: the environment and configuration of the system, or Installation Qualification; and what is being tested, i.e. which applications, user requirements, or groups of requirements ARE being tested and which are NOT, referring to the change impact and risk analysis processes. <Click> How are requirements proven? It is not necessary to demonstrate/test all requirements using the same methods. With the potential choice of automated or manual tests, and the possibility of testing at different stages of the validation, the Validation Plan should detail which methods are being used to demonstrate which User Requirements, with the justification for those decisions. <Click> Summary Report In essence, the summary report is a statement of whether it has been proven that the software is suitable for purpose. It should include any deviations from the Validation Plan, and details of any incidents encountered during the validation, including what the impact is and how they are to be resolved or mitigated. The summary report is not a replacement for all of the validation evidence, but an initial point of reference (for audit or regulatory bodies, for instance). Therefore the report should detail what evidence is available, and where it can be found.
Validation starts with a Plan, the what, how, and why, which guides the execution activities and determines what documentary evidence will be generated. <Click> Validation Report Validation ends with a summary report: was the plan followed, and has it been demonstrated that the system works the way you intend to use it? Validation Plan: what, how, and why you are validating. Validation Summary Report: demonstrate “suitability for purpose”.
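The summary report's role as an index over the evidence can be sketched as a requirements-to-results traceability matrix. The data shapes here (requirement IDs as strings, results as dictionaries) are assumptions carried over from the earlier sketches, not a defined interchange format.

```python
def traceability_matrix(requirement_ids, test_results):
    """Map each requirement ID to the pass/fail outcomes of its tests.
    Requirements with no results surface as empty lists, so planned
    exclusions can be checked against the risk analysis justification."""
    matrix = {req_id: [] for req_id in requirement_ids}
    for result in test_results:
        matrix[result["requirement"]].append(result["passed"])
    return matrix

def report_summary(matrix):
    """Counts that would feed a validation summary report (illustrative)."""
    tested = {r: outcomes for r, outcomes in matrix.items() if outcomes}
    return {
        "total": len(matrix),
        "tested": len(tested),
        "fully_passed": sum(all(outcomes) for outcomes in tested.values()),
        "untested": sorted(r for r, outcomes in matrix.items() if not outcomes),
    }
```

Any requirement appearing in the "untested" list must either map to a documented exclusion in the plan or be flagged as a deviation, which is exactly the cross-check an auditor would perform.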

Questions Retrospective Evaluation (already using the software): do a gap analysis of what you have against what you should have, then fill the gaps… perform the validation! Maintaining the validated state: what changes may affect the behaviour of the system, and what tests need to be executed to ensure those changes are covered (user requirements, risk analysis)? Change impact analysis. Questions
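Both closing questions reduce to set comparisons, which can be sketched in a few lines. The deliverable names and the requirement-to-module mapping below are hypothetical; in practice the mapping would come from the traceability and risk analysis records already discussed.

```python
def gap_analysis(required: set, existing: set) -> set:
    """Retrospective evaluation: deliverables you should have, minus those
    you already have. The result is what must be produced to validate."""
    return required - existing

def impacted_requirements(changed_modules: set, req_to_modules: dict) -> set:
    """Change impact sketch: re-test any requirement that touches a changed
    part of the system, per the user requirements risk analysis."""
    return {req for req, modules in req_to_modules.items() if modules & changed_modules}
```

For example, a legacy system with only a requirements document in place would show the qualification stages as gaps, and a change to one module would select only the requirements mapped to it for re-testing.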