
1 Introduction to Quality Assurance

2 Quality assurance vs. Quality control

3 Quality Assurance Program designed to monitor and evaluate the ongoing and overall quality of the total testing process (preanalytic, analytic, and postanalytic)

4 Quality Control Activities designed to monitor and evaluate the performance of the instruments and reagents used in the testing process. Quality control is a component of a QA program.

5 CLIA Clinical Laboratory Improvement Amendments. Resulted from public and Congressional concerns about the quality of clinical laboratory testing in the U.S. Established a basic set of guidelines that applies to all labs, regardless of size, complexity, or location. Development and implementation of the working guidelines was assigned to HCFA (Health Care Financing Administration), now known as CMS (Centers for Medicare & Medicaid Services).

6 CLIA The intent of CLIA is to promote the development, implementation, delivery, monitoring, and improvement of high quality laboratory services.

7 CLIA Originally consisted of four sets of rules describing: laboratory standards; personnel standards; quality control requirements; the test complexity model; quality assessment of the complete testing process; the application process and user fees; enforcement procedures; and approval of accreditation programs.

8 Total Testing Process. Pre-Analytic: physician order, patient preparation, specimen acquisition, specimen handling, sample transport. Analytic: sample prep, analyzer setup, test calibration, quality control, sample analysis. Post-Analytic: test report, transmittal of report, receipt of report, review of test results, action on test results.

9 Quality Assurance activities Patient test management assessment: specimen collection, labeling, and transport; test requisition; specimen rejection; test report format and reporting systems. Quality control assessment: calibrations and controls; patient data ranges; reporting errors.

10 Quality Assurance activities (cont.) Proficiency testing assessment: "unknowns" tested 2-3 times per year. Comparison of test results: different assays or instruments used for the same test; accuracy and reproducibility.

11 Quality Assurance activities (cont.) Relationship of patient information to test results: are results consistent with patient information such as age, sex, diagnosis, and other results? Personnel assessment: education and competency.

12 Quality Assurance activities (cont.) Communications and complaint investigations: communications log. QA review with staff: review during regular meetings.

13 Quality Assurance activities (cont.) QA records: retention for 2 years. Verification of methods: accuracy, precision, analytical sensitivity and specificity, reportable range, and reference range(s) (normal values).

14 Quality Assurance activities (cont.) Quality monitors: TAT (turnaround time), smear/culture correlation, blood culture contamination rates.

15 Assessment of compliance College of American Pathologists (CAP): a professional pathology organization that has been granted "deemed status" by CMS. Groups of peers conduct biennial (every two years) site inspections, and CAP publishes checklists for laboratories to document compliance.


18 How do we assess the performance of our tests?

19 Verification vs. Validation

20 Verification One-time process used to evaluate or establish the performance of a system or test to ensure that it meets the desired specifications

21 Validation Ongoing process to demonstrate that a system or test is meeting the operational needs of the user

22 Verification Background CLIA requires checking (verifying) the manufacturer's performance specifications provided in the package insert. This assures that the test is performing as intended by the manufacturer with your testing personnel, in your patient population, and in your laboratory setting. It is a one-time process performed prior to implementation.

23 Verification Accuracy Are your test results correct? Verify that the test performs as intended by the manufacturer by testing QC materials, PT materials, or previously tested patient specimens.
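
The accuracy check above lends itself to a simple tally. Below is a minimal sketch (not from the original slides) for a qualitative test, assuming paired results from the new test and a reference method; the function name and example data are illustrative only.

    def percent_agreement(new_results, reference_results):
        # Percentage of specimens where the new test matches the
        # reference/expected result.
        matches = sum(1 for n, r in zip(new_results, reference_results) if n == r)
        return 100.0 * matches / len(new_results)

    # Hypothetical run of 20 previously tested specimens:
    new = ["pos"] * 9 + ["neg"] * 11
    ref = ["pos"] * 10 + ["neg"] * 10
    print(percent_agreement(new, ref))  # 95.0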

24 Verification Precision Can you obtain the same test result time after time? Test the same samples on the same and different days (reproducibility) and with different laboratory personnel (operator variance).
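
One common way to summarize precision for a quantitative analyte is the coefficient of variation (CV) across replicate measurements. A small sketch; the replicate values below are made up for illustration.

    import statistics

    def cv_percent(replicates):
        # Coefficient of variation (%) for repeated measurements of one sample;
        # a low CV indicates reproducible results.
        return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

    # Hypothetical control material run 5 times on different days:
    print(round(cv_percent([101, 99, 100, 102, 98]), 2))  # ~1.58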

25 Verification Reportable Range How high and how low can test values be and still be accurate (qualitative)? Choose samples with known values at the high and low ends of the range claimed by the manufacturer. Over what range is the test linear (quantitative)? Test samples across the range.
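
For a quantitative test, the linearity part of this check can be summarized by regressing measured values against expected values across the claimed range; a slope near 1, an intercept near 0, and r near 1 support the manufacturer's reportable range. A minimal sketch with made-up numbers:

    def linear_fit(expected, measured):
        # Ordinary least-squares slope, intercept, and correlation coefficient r
        # for measured vs. expected values.
        n = len(expected)
        mx, my = sum(expected) / n, sum(measured) / n
        sxy = sum((x - mx) * (y - my) for x, y in zip(expected, measured))
        sxx = sum((x - mx) ** 2 for x in expected)
        syy = sum((y - my) ** 2 for y in measured)
        slope = sxy / sxx
        intercept = my - slope * mx
        r = sxy / (sxx * syy) ** 0.5
        return slope, intercept, r

    # Hypothetical linearity panel spanning the claimed range:
    print(linear_fit([5, 50, 100, 200, 400], [6, 52, 98, 205, 396]))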

26 Verification Reference ranges/intervals (normal values) Do the reference ranges provided by the test system's manufacturer fit your patient population? Start with the manufacturer's suggested ranges and published ranges. Ranges can vary based on the type of patient and may need to be adjusted over time; normal patients should fall within the range and abnormal patients outside it.
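
If the manufacturer's range needs to be checked or re-established locally, one common nonparametric approach is to take the central 95% of results from apparently healthy individuals. A sketch only, under that assumption; the function and variable names are illustrative.

    import statistics

    def reference_interval(healthy_values):
        # Nonparametric central 95% interval: the 2.5th and 97.5th percentiles
        # of results from apparently healthy individuals.
        cuts = statistics.quantiles(healthy_values, n=40)  # cut points every 2.5%
        return cuts[0], cuts[-1]

    # Illustrative use: compare the computed interval to the package-insert range.
    # print(reference_interval(local_healthy_results))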

27 Verification Number of samples to test Depends on the test system and laboratory testing volume: FDA-approved tests, 20 positives and negatives; non-FDA-approved tests, 50 positives and negatives. The number used for each part of the verification will vary. The laboratory director must review and approve the results before patient results are reported.

28 Sensitivity The probability of a positive test result given the presence of disease. How good is the test at detecting infection in those who have the disease? A sensitive test will rarely miss people who have the disease (few false negatives).

29 Specificity The probability of a negative test result given the absence of disease. How good is the test at calling uninfected people negative? A specific test will rarely misclassify people without the disease as infected (few false positives).

30 Sensitivity and Specificity (2x2 table of test result vs. disease status) Sensitivity = TP/(TP+FN); Specificity = TN/(TN+FP).
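
These two formulas translate directly into code. A minimal sketch using the counts from a 2x2 table (variable and function names are assumptions):

    def sensitivity(tp, fn):
        # Probability of a positive test result given disease: TP / (TP + FN).
        return tp / (tp + fn)

    def specificity(tn, fp):
        # Probability of a negative test result given no disease: TN / (TN + FP).
        return tn / (tn + fp)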

31 Predictive Value The probability of the presence or absence of disease given the result of a test. PVP is the probability of disease in a patient with a positive test result; PVN is the probability of not having disease when the test result is negative.

32 Predictive Value (2x2 table of test result vs. disease status) Predictive Value Positive (PVP) = TP/(TP+FP); Predictive Value Negative (PVN) = TN/(TN+FN).
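
The predictive-value formulas read the other margin of the same 2x2 table. A matching sketch (names are illustrative):

    def pvp(tp, fp):
        # Predictive value positive: probability of disease given a positive
        # result, TP / (TP + FP).
        return tp / (tp + fp)

    def pvn(tn, fn):
        # Predictive value negative: probability of no disease given a negative
        # result, TN / (TN + FN).
        return tn / (tn + fn)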

33 Predictive Value How predictive is this test result for this particular patient? Determined by the sensitivity and specificity of the test, and the prevalence rate of disease in the population being tested.

34 Prevalence Rate Number of cases of illness existing at a given time divided by the population at risk


36 Hypothetical Influenza Test Performance, Prevalence = 20.0% (2x2 counts: TP = 380, FN = 20, FP = 64, TN = 1,536). Sensitivity = 380/400 = 95.0%; Specificity = 1536/1600 = 96.0%; Predictive Value Positive (PVP) = 380/444 = 85.6%; Predictive Value Negative (PVN) = 1536/1556 = 98.7%.

37 Hypothetical Influenza Test Performance, Prevalence = 1.0% (2x2 counts: TP = 19, FN = 1, FP = 80, TN = 1,900). Sensitivity = 19/20 = 95.0%; Specificity = 1900/1980 = 96.0%; Predictive Value Positive (PVP) = 19/99 = 19.2%; Predictive Value Negative (PVN) = 1900/1901 = 99.9%.
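
The two scenarios above can be reproduced from their 2x2 counts; the counts below are derived from the fractions on the slides, and the helper function is an illustration only.

    def test_metrics(tp, fn, fp, tn):
        # Prevalence, sensitivity, specificity, PVP, and PVN from 2x2 counts.
        total = tp + fn + fp + tn
        return {
            "prevalence": (tp + fn) / total,
            "sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "PVP": tp / (tp + fp),
            "PVN": tn / (tn + fn),
        }

    # 20% prevalence: PVP = 85.6%, PVN = 98.7%
    print(test_metrics(tp=380, fn=20, fp=64, tn=1536))
    # 1% prevalence: same sensitivity and specificity, but PVP drops to 19.2%
    print(test_metrics(tp=19, fn=1, fp=80, tn=1900))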

38 Predictive Value Positive: Dependence on Sensitivity, Specificity and Prevalence
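
The dependence shown on this slide follows from Bayes' rule: with sensitivity and specificity held fixed, PVP rises with prevalence. A short sketch (not from the original deck) that traces the relationship for the 95%/96% test above:

    def pvp_from_prevalence(sens, spec, prev):
        # Bayes' rule: PVP = (sens * prev) / (sens * prev + (1 - spec) * (1 - prev)).
        true_pos = sens * prev
        false_pos = (1 - spec) * (1 - prev)
        return true_pos / (true_pos + false_pos)

    for prev in (0.01, 0.05, 0.20, 0.50):
        print(f"prevalence {prev:.0%}: PVP = {pvp_from_prevalence(0.95, 0.96, prev):.1%}")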

39 Resources CAP checklists (available on the W: drive). Clark RB, et al. Verification and Validation of Procedures in the Clinical Microbiology Laboratory. Cumitech 31A. ASM Press; 2009.

