©2014 MFMER | slide-1 So many questions, So much data, So little time A Journey with Surgical Outcome Data Alison M. Knight, P.E. IIE 2014 National Conference

©2014 MFMER | slide-2 Agenda
- Tutorial in quality databases
- Background information on the project
- Strategy and process for report creation
- Example of report and metrics chosen

©2014 MFMER | slide-3 Quick Tutorial in Surgical Quality Databases
The dirty secrets THEY don't tell you

©2014 MFMER | slide-4 Lesson #1: Quality databases are NOT created equal
Database Attributes:
- Clinically Abstracted vs. Administrative Data
- Outcome Definition
- Data Audit Process
- Sample vs. Population

©2014 MFMER | slide-5 Lesson #2: Use all databases, but choose metrics carefully
Example #1: Mortality & Length of Stay → Administrative Data & All Cases
Example #2: Surgical Site Infection & Urinary Tract Infection → Clinically Abstracted Data & Sample of Cases

©2014 MFMER | slide-6 Lesson #3: Data denial will happen. Be ready!
Five Stages of Data Denial*
1. The data is wrong
2. The data is old
3. We have changed already
4. This is a different place
5. We tried that already
*Borrowed from a presentation by Dr. Brian Postl, CEO of the Winnipeg Regional Health Authority

©2014 MFMER | slide-7 Let’s set the stage

©2014 MFMER | slide-8 The surgical practice is frustrated
- New costs: each surgical specialty pays a portion
- No perceived value: most surgical specialties believed this quality data provided little value
- Data is not trustworthy: surgical specialties didn't believe the data

©2014 MFMER | slide-9 Some background facts…

©2014 MFMER | slide-10 What is NSQIP?
The American College of Surgeons (ACS) National Surgical Quality Improvement Program
History:
- Started in the Veterans Administration (1994)*
- Opened to other hospitals (2004)*
- Mayo Clinic joined in 2006
Clinically abstracted random sample of cases
*ACS NSQIP website

©2014 MFMER | slide-11 What types of information are included and how are they collected?
- Uniform Operational Definitions
- 135 Patient Variables
- 30-day Post-op Assessment
- Audit Process
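
To make the data model concrete, here is a minimal sketch of what one clinically abstracted case record might look like. The field names are illustrative stand-ins only, not actual NSQIP variable names:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

# Hypothetical shape of one abstracted case; real NSQIP records carry
# ~135 patient variables collected under uniform operational definitions.
@dataclass
class AbstractedCase:
    case_id: str
    cpt_code: str                            # principal procedure code
    operation_date: date
    age_years: int                           # example patient variable
    diabetes: bool                           # example comorbidity variable
    asa_class: int                           # example risk variable
    ssi_30day: Optional[bool] = None         # outcomes assessed over a
    uti_30day: Optional[bool] = None         # 30-day post-op window
    mortality_30day: Optional[bool] = None
```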

©2014 MFMER | slide-12 Data Delivery to the Practice
- Twice a year the practice chair received a "canned report" from NSQIP
- Each specialty had risk-adjusted odds ratios
- Ranged in granularity depending on specialty
- Highlighted the specialty's decile rankings (1-10)
Decile*: divides the distribution of the variable into ten groups having equal frequencies.
*Source: dictionary.com
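
As a quick illustration of the decile definition above, this minimal sketch shows how risk-adjusted odds ratios could be binned into decile rankings; the site names and values are made up:

```python
import pandas as pd

# Hypothetical risk-adjusted odds ratios for one outcome across ten
# sites; names and values are fabricated for illustration.
odds_ratios = pd.Series(
    [0.62, 0.85, 0.91, 1.02, 1.10, 1.18, 1.25, 1.40, 1.63, 2.05],
    index=[f"hospital_{i}" for i in range(1, 11)],
)

# qcut splits the distribution into ten equal-frequency groups; labels
# 1-10 mimic a decile ranking (a lower odds ratio is better).
deciles = pd.qcut(odds_ratios, q=10, labels=list(range(1, 11)))
print(deciles.sort_values())
```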

©2014 MFMER | slide-13 The Project

©2014 MFMER | slide-14 Project Goals
- Educate: educate providers on database facts; work through the stages of denial
- Provide: provide data at a more detailed level; point providers to "hot spots"
- Change: aid each specialty in developing data-driven quality projects

©2014 MFMER | slide-15 Identified the Challenges
- NSQIP costs money, value not recognized: "Why should I pay for NSQIP if I am not using it?"
- Lack of understanding of the data collection process: "Documentation errors are probably driving this data"
- Decile rankings are the focus: "These rankings can't possibly be right; our patients are sicker and we do more complicated procedures"
- Specialty practices were recognized only for negative quality: "Leadership just doesn't understand our practice"

©2014 MFMER | slide-16 The Plan to Create a Meaningful Dashboard
Goals:
- Practice Input
- Use in Research
- Address Concerns
- De-emphasize Rankings
- Focus on Good & Bad
- Make Data Actionable
- Dispel Myths

©2014 MFMER | slide-17 The process of development: PDSA-Like Cycle (Create → Review → Present → Revise)
- One specialty at a time
- Analysis done by hand
- Review report for completeness
- Obtain feedback from each specialty for improvements

©2014 MFMER | slide-18 A Sample Report
This data has been created for example purposes only and does not reflect real Mayo Clinic data.

©2014 MFMER | slide-19 Components of the Report
- Generic information from NSQIP
  - Data collection and risk-adjustment information
- Specialty-specific information
  - Without risk adjustment
- Example research articles using NSQIP data
- Recommendations to the practice

©2014 MFMER | slide-20 Demographic Data Example* *Data is fictitious and does not reflect Mayo Clinic performance.

©2014 MFMER | slide-21 Volume Analysis by CPT Code Example* *Data is fictitious and does not reflect Mayo Clinic performance.

©2014 MFMER | slide-22 Occurrences: Raw Rates Example* *Data is fictitious and does not reflect Mayo Clinic performance.
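
A minimal sketch of how the volume-by-CPT and raw-rate tables on the last two slides could be computed from a case-level extract; the column and category names here are hypothetical:

```python
import pandas as pd

# Hypothetical case-level extract; column names are assumptions.
cases = pd.DataFrame({
    "cpt_category": ["colectomy", "colectomy", "hernia", "hernia", "hernia"],
    "ssi": [1, 0, 0, 1, 0],   # 1 = occurrence observed within 30 days
    "uti": [0, 0, 1, 0, 0],
})

# Case volume per CPT category (volume analysis).
volume = cases.groupby("cpt_category").size().rename("cases")

# Raw, unadjusted occurrence rates: event count / case count,
# with no risk adjustment applied.
rates = cases.groupby("cpt_category")[["ssi", "uti"]].mean()

print(pd.concat([volume, rates], axis=1))
```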

©2014 MFMER | slide-23 Case Volume versus Occurrence Volume* *Data is fictitious and does not reflect Mayo Clinic performance.
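
The case-volume versus occurrence-volume view can be reproduced as a simple labeled scatter plot; the numbers below are fabricated for illustration only:

```python
import matplotlib.pyplot as plt

# Hypothetical per-CPT-category totals (made-up numbers).
categories = ["Category #1", "Category #2", "Category #3", "Category #4"]
case_volume = [420, 180, 95, 310]
occurrence_volume = [12, 15, 3, 8]

fig, ax = plt.subplots()
ax.scatter(case_volume, occurrence_volume)
for label, x, y in zip(categories, case_volume, occurrence_volume):
    ax.annotate(label, (x, y))
ax.set_xlabel("Case volume")
ax.set_ylabel("Occurrence volume")
# A modest-volume category with many occurrences stands out as a
# candidate "hot spot" for a quality project.
plt.show()
```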

©2014 MFMER | slide-24 Rankings Example* *Data is fictitious and does not reflect Mayo Clinic performance.

©2014 MFMER | slide-25 Recommendations and Next Steps
- Based on the data, Outcome #1 and Outcome #3 should be addressed
- CPT Code Category #2 shows opportunity for improvement
- Additional detailed information available by request:
  - All case data pulled for specific CPT code categories and/or cases with specific outcomes
  - Logistic regression analysis to recognize factors that contribute to certain outcomes (a sketch follows below)
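
As a rough sketch of that logistic-regression follow-up: fit the 0/1 outcome against candidate patient factors and read the exponentiated coefficients as odds ratios. The data and column names below are fabricated; a real analysis would run on the abstracted case extract.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Fabricated case-level data; column names are illustrative only.
df = pd.DataFrame({
    "ssi":       [0, 1, 0, 0, 1, 0, 1, 0, 1, 0],
    "age":       [54, 71, 45, 78, 62, 50, 69, 66, 58, 73],
    "diabetes":  [0, 1, 0, 1, 1, 0, 0, 1, 1, 0],
    "emergency": [0, 1, 1, 0, 0, 0, 1, 0, 0, 1],
})

X, y = df[["age", "diabetes", "emergency"]], df["ssi"]
model = LogisticRegression(max_iter=1000).fit(X, y)

# Exponentiated coefficients read as (regularized) odds ratios.
for name, beta in zip(X.columns, model.coef_[0]):
    print(f"{name}: OR ~ {np.exp(beta):.2f}")
```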

©2014 MFMER | slide-26 Current State and Next Steps

©2014 MFMER | slide-27 Where are we now?
- Stable report that practices find helpful
- Worked through the 5 stages of data denial
- Creating an automated process to provide information in a timely manner
- Creating processes for follow-up and practice accountability

©2014 MFMER | slide-28 Summary
- All databases are not created equal
- Use the correct type of database depending on the information needed
- Address data denial
- De-emphasize rankings
- Create a report that magnifies the "hot spots" for practices to launch quality projects

©2014 MFMER | slide-29 Questions & Discussion