1 National Strategy and Toolkit for NHSN Data Validation
Kathryn E. Arnold, MD, Medical Officer, Division of Healthcare Quality Promotion
National Center for Emerging and Zoonotic Infectious Diseases
2012 CSTE Annual Conference, June 3-7, 2012

2 HAI Data Validation is Important
- Credible data are vital for prevention, public reporting, and incentivizing improvements in clinical performance
- Concerns about uneven data quality
  - Always important, now more than ever
- Validation can improve fairness
- Need for training at all levels
- Validation findings help guide training

3 What Do We Mean by Validation?

4
- Assure production of high-quality surveillance data
- Ability to generate correct denominator data
- Ability to identify all candidate events in real time
- Routine assessment and tracking of candidate events
- Ability to correctly apply case definitions
- Minimized data-entry error

5 How Do We Develop a Standardized, Scalable Approach to Validation That Can Work in Any State?

6 States as Validation Laboratories, 2010-2011
- States created innovative approaches under ARRA
- Central Line-Associated Bloodstream Infection (CLABSI):
  - Structure of sampling frame
  - Numerator sampling approaches
  - Checklists for case classification
  - Denominator methods surveys
  - Risk-factor (location mapping) investigations
- Surgical Site Infection (SSI):
  - Data linkage to enrich targeted samples (procedures) for SSI
  - In-house and post-discharge case-finding surveys
  - Risk-factor audits in Access database

7 Map: CLABSI Externally Validated by State, as of 2012 (dots indicate a CLABSI reporting mandate by 2012)

8 Map: SSI Externally Validated by State, as of 2012 (dots indicate an SSI reporting mandate by 2012)

9 State and CMS Validation are Complementary, but Different
- Approach
  - State: differs state-by-state
  - CMS: nationwide probability sample
- Constrained by
  - State: statute (access to data) and resources
  - CMS: statute (scope), resources, and existing infrastructure
- Validates
  - State: numerator; denominator methods; risk-adjustment variables
  - CMS: numerator
- Sampling
  - State: varies; often targeted
  - CMS: small sample from all IPPS hospitals, at least every 4 years
- Primary goals
  - State: improve surveillance practices; understand weaknesses for teaching; optimize data quality at all levels
  - CMS: assure compliance; validate accuracy of the metric; motivate internal improvement

10 National Strategy for NHSN Data Validation
- Document and characterize the need for NHSN validation
- Recognize CMS role in motivating facility engagement
- Demonstrate the unique value of states in conducting NHSN validation
  - Because ALL data cannot be validated, states use data to assure competence, identify weaknesses in surveillance, and enable improvement by teaching
- Develop guidance, determine costs
- Identify funding
- Sustain and enhance capacity
- Harmonize work among stakeholders

11 2012 Validation Guidance and Toolkit: CLABSI and SSI
Chapter 1:
- Overview and Framework
  - Intrinsic (built-in) validation
  - Internal (to NHSN and reporters) validation
  - External (to NHSN or reporters) validation
- Types of External Validation
- Examples of SHD (state health department) Validation Approaches
  - Targeted External Validation
  - Probability Samples for External Validation
  - Hybrid approaches

12 Approaches to External Validation
- Targeted External Validation: TN (others)
  - Efficient for improving data quality and teaching to reporting errors
- Probability Samples: OR (CT, CMS, WA)
  - Needed for extrapolation of performance estimates, and preferred for longitudinal assessment (see the sketch below)
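To make the contrast concrete, the following is a minimal sketch in Python, not part of the toolkit itself. The facility records and field names (facility_id, reported_clabsi, central_line_days) are hypothetical placeholders rather than NHSN export fields; the targeting rule is only one example of the kind of criterion a state might use.

```python
import random

# Hypothetical facility-level summaries (illustrative values only).
facilities = [
    {"facility_id": "A01", "reported_clabsi": 0, "central_line_days": 4200},
    {"facility_id": "B02", "reported_clabsi": 3, "central_line_days": 1800},
    {"facility_id": "C03", "reported_clabsi": 1, "central_line_days": 950},
    {"facility_id": "D04", "reported_clabsi": 0, "central_line_days": 600},
]

# Targeted selection: flag facilities whose reports look implausible
# (here, zero events despite heavy central-line use). Efficient for
# finding and teaching to reporting errors, but not for extrapolation.
targeted = [f for f in facilities
            if f["reported_clabsi"] == 0 and f["central_line_days"] > 3000]

# Probability sample: a simple random sample with a known selection
# probability, which supports extrapolating performance estimates.
random.seed(2012)
probability_sample = random.sample(facilities, k=2)

print("targeted:", [f["facility_id"] for f in targeted])
print("probability:", [f["facility_id"] for f in probability_sample])
```

A hybrid approach, as named on slide 11, would combine the two: a probability sample for unbiased estimates plus a targeted add-on for facilities flagged by rules like the one above.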

13 Chapters 2-4: CLABSI
- Internal Validation (Quality Assurance)
  - For reporting facilities
  - For group users
- Targeted External Validation
- External Validation Using Probability Samples
- CLABSI Validation Tools

14 CLABSI Validation Tools
- Access Database (New York)
- Facility Self-Validation Tool
- Denominator Collection Methods Survey
- Algorithmic Use of NHSN Analysis to Target Facilities
- Example Letter Requesting External Validation Site Visit
- Checklists for Validation (Tennessee)
- Template for Audit Discrepancies Report
- Example Validation Follow-up Letters, With and Without Problems
- Scalable Self-Weighting Sample Using Probability Proportional to Size (see the sketch below)
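The last tool names probability-proportional-to-size (PPS) selection. As a rough illustration only, and not the toolkit's actual worksheet, the sketch below implements systematic PPS selection of facilities, where size is some volume measure such as central-line days (the field names and values are assumptions). A fixed number of records reviewed within each selected facility would then make the overall design approximately self-weighting.

```python
import random

def pps_systematic_sample(units, n_select, size_key="size", seed=None):
    """Select n_select units with probability proportional to size,
    using a random start and a fixed interval over cumulative sizes."""
    rng = random.Random(seed)
    total = sum(u[size_key] for u in units)
    interval = total / n_select
    start = rng.uniform(0, interval)
    targets = [start + k * interval for k in range(n_select)]

    selected, cumulative, i = [], 0.0, 0
    for u in units:
        cumulative += u[size_key]
        # A very large unit can exceed the interval and be hit more than
        # once; that is inherent to plain systematic PPS selection.
        while i < n_select and targets[i] <= cumulative:
            selected.append(u)
            i += 1
    return selected

# Hypothetical facilities sized by central-line days (illustrative only).
facilities = [
    {"facility_id": "A01", "size": 4200},
    {"facility_id": "B02", "size": 1800},
    {"facility_id": "C03", "size": 950},
    {"facility_id": "D04", "size": 3100},
]
print([f["facility_id"] for f in pps_systematic_sample(facilities, 2, seed=2012)])
```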

15 Chapters 5-7: SSI
- Internal Validation (Quality Assurance)
  - For reporters
  - For group users
- Targeted External Validation
- External Validation Using Probability Samples
- SSI Validation Tools

16 SSI Tools
- Expected and Unusual Values for Surgery Variables
- Admission Surveillance Practices Survey
- Post-Discharge Surveillance Practices Survey
- Developing an Enriched Sampling Frame for Targeted SSI Validation (see the sketch below)
- ICD-9 Procedure Codes, and ICD-9 Diagnostic Codes Suggestive of SSIs
- Expected Length of Stay for NHSN Procedures
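One way to read the "enriched sampling frame" tool is as a linkage step: flag discharges whose diagnosis codes suggest a possible SSI, and oversample the associated NHSN procedures for validation. The sketch below is only an assumed illustration of that idea; the code set, file layout, and column names (procedure_id, dx_codes) are placeholders, not the toolkit's actual code list or format.

```python
import csv

# Example ICD-9 diagnosis codes only; the toolkit's own list should be used.
SSI_SUGGESTIVE_DX = {"998.59", "998.51", "996.67"}

def enriched_frame(discharge_csv):
    """Return procedure IDs whose discharge record carries a diagnosis
    code suggestive of SSI; these become the targeted validation sample."""
    frame = []
    with open(discharge_csv, newline="") as fh:
        for row in csv.DictReader(fh):
            dx_codes = {c.strip() for c in row["dx_codes"].split(";")}
            if dx_codes & SSI_SUGGESTIVE_DX:
                frame.append(row["procedure_id"])
    return frame

# Usage (hypothetical file with 'procedure_id' and 'dx_codes' columns):
# targets = enriched_frame("discharges_2012.csv")
```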

17 Quality Improvement for the Toolkit
- Post-Validation Analysis to Help with Future Iterations of the Toolkit
- Rate the Toolkit

18 Pre-Clearance Input
- We are not seeking to distribute the document widely yet; we are seeking feedback
- We invite reviewers who are willing to read and provide meaningful input on this first (pre-clearance) iteration of the Guidance and Toolkit
- If you are interested, please let us know (KEA3@CDC.GOV)
- Please come to Rachel Stricof's Roundtable for more discussion of targeted vs. probability sampling
  - Roundtable: Tuesday, 5:45, Herndon

19 Thank You!
(CSTE) Rachel Stricof
(State Partners) Lynn Janssen (CA), Richard Melchreit (CT), Carole Van Antwerpen and Valerie Haley (NY), Paul Cieslak and Zintars Beldavs (OR), Marion Kainer and Brynn Berger (TN), David Birnbaum (WA), and many others
(CDC) James Baggs, Maggie Dudeck, Jonathan Edwards, Ryan Fagan, Scott Fridkin, Teresa Horan, Paul Malpiedi, Daniel Pollock, Cathy Rebmann, Philip Ricks, Dawn Sievert, Arjun Srinivasan, Nicola Thompson, Elizabeth Zell

The findings and conclusions in this report are those of the author and do not necessarily represent the official position of the Centers for Disease Control and Prevention.
National Center for Emerging and Zoonotic Infectious Diseases

