So Many Questions, So Much Data, So Little Time: A Journey with Surgical Outcome Data
Alison M. Knight, P.E.
IIE 2014 National Conference
©2014 MFMER
Agenda
- Tutorial on quality databases
- Background information on the project
- Strategy and process for report creation
- Example report and the metrics chosen
Quick Tutorial in Surgical Quality Databases
The dirty secrets THEY don't tell you
Lesson #1: Quality databases are NOT created equal
Database attributes to compare:
- Clinically abstracted vs. administrative data
- Outcome definitions
- Data audit process
- Sample vs. population
Lesson #2: Use all databases, but choose metrics carefully
- Example #1: Mortality and length of stay, drawn from administrative data covering all cases
- Example #2: Surgical site infection and urinary tract infection, drawn from clinically abstracted data on a sample of cases
(See the sketch below for this pairing expressed as a simple lookup.)
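One way to make Lesson #2 concrete is a metric-to-source lookup. This is a minimal sketch; the metric keys and source labels are illustrative, not drawn from NSQIP or any Mayo Clinic system.

```python
# Hypothetical mapping from outcome metric to the database type best
# suited to reporting it; names and labels are illustrative only.
METRIC_SOURCE = {
    "mortality": "administrative data, all cases",
    "length_of_stay": "administrative data, all cases",
    "surgical_site_infection": "clinically abstracted data, sample of cases",
    "urinary_tract_infection": "clinically abstracted data, sample of cases",
}

def recommended_source(metric: str) -> str:
    """Return the database type best suited to a given outcome metric."""
    return METRIC_SOURCE.get(metric, "unmapped; review the metric's definition first")

print(recommended_source("mortality"))  # administrative data, all cases
```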
Lesson #3: Data denial will happen. Be ready!
The Five Stages of Data Denial*:
1. The data is wrong.
2. The data is old.
3. We have changed already.
4. This is a different place.
5. We tried that already.
*Borrowed from a presentation by Dr. Brian Postl, CEO of the Winnipeg Regional Health Authority
Let's set the stage
The surgical practice is frustrated
- New costs: each surgical specialty pays a portion of the program's cost
- No perceived value: most surgical specialties believed the quality data provided little value
- Data is not trustworthy: surgical specialties did not believe the data
Some background facts…
What is NSQIP?
The American College of Surgeons (ACS) National Surgical Quality Improvement Program
History:
- Started in the Veterans Administration (1994)*
- Opened to other hospitals (2004)*
- Mayo Clinic joined in 2006
Based on a clinically abstracted random sample of cases
*Source: ACS NSQIP website
What types of information are included, and how are they collected?
- Uniform operational definitions
- 135 patient variables
- 30-day post-op assessment
- Audit process
Data Delivery to the Practice
- Twice a year, the practice chair received a "canned report" from NSQIP
- Each specialty received risk-adjusted odds ratios
- Granularity varied by specialty
- The report highlighted the specialty's decile rankings (1-10)
Decile*: divides the distribution of a variable into ten groups having equal frequencies. (A short computational sketch follows below.)
*Source: dictionary.com
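For reference, decile assignment can be sketched as follows. The odds-ratio values are fictitious, and the percentile-cut method shown is a generic illustration of the definition above, not necessarily NSQIP's exact procedure.

```python
# Assign each hospital's risk-adjusted odds ratio to a decile (1-10).
import numpy as np

odds_ratios = np.array([0.62, 0.81, 0.95, 1.02, 1.10, 1.18, 1.25, 1.40, 1.63, 2.05,
                        0.70, 0.88, 0.99, 1.05, 1.15, 1.22, 1.33, 1.48, 1.75, 2.30])

# Cut points at the 10th, 20th, ..., 90th percentiles split the
# distribution into ten groups of (approximately) equal frequency.
cut_points = np.percentile(odds_ratios, np.arange(10, 100, 10))

# Decile 1 holds the lowest odds ratios, decile 10 the highest.
deciles = np.searchsorted(cut_points, odds_ratios) + 1
for value, decile in zip(odds_ratios, deciles):
    print(f"OR {value:.2f} -> decile {decile}")
```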
The Project
Project Goals
- Educate: educate providers on database facts; work through the stages of denial
- Provide: provide data at a more detailed level; point providers to "hot spots"
- Change: aid each specialty in developing data-driven quality projects
Identified Challenges
- NSQIP costs money, and its value was not recognized: "Why should I pay for NSQIP if I am not using it?"
- Lack of understanding of the data collection process: "Documentation errors are probably driving this data."
- Decile rankings are the focus: "These rankings can't possibly be right; our patients are sicker and we do more complicated procedures."
- Specialty practices were recognized only for negative quality: "Leadership just doesn't understand our practice."
The Plan to Create a Meaningful Dashboard
Goals:
- Incorporate practice input
- Enable use in research
- Address concerns
- De-emphasize rankings
- Focus on the good and the bad
- Make the data actionable
- Dispel myths
The Process of Development: A PDSA-Like Cycle (Create → Review → Present → Revise)
- One specialty at a time
- Analysis done by hand
- Review each report for completeness
- Obtain feedback from each specialty for improvements
A Sample Report
This data has been created for example purposes only and does not reflect real Mayo Clinic data.
Components of the Report
- Generic information from NSQIP: data collection and risk-adjustment information
- Specialty-specific information, without risk adjustment
- Example research articles using NSQIP data
- Recommendations to the practice
Demographic Data Example*
*Data is fictitious and does not reflect Mayo Clinic performance.
Volume Analysis by CPT Code Example*
*Data is fictitious and does not reflect Mayo Clinic performance.
Occurrences: Raw Rates Example*
*Data is fictitious and does not reflect Mayo Clinic performance.
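A raw rate here is simply unadjusted: occurrences divided by cases reviewed, with no risk adjustment applied. A minimal sketch with fictitious counts (the outcome names and numbers are illustrative only):

```python
# Compute unadjusted occurrence rates per outcome from fictitious counts.
cases_reviewed = 480          # sampled cases for the specialty (fictitious)
occurrences = {               # fictitious outcome counts
    "Surgical site infection": 19,
    "Urinary tract infection": 11,
    "30-day mortality": 4,
}

for outcome, count in occurrences.items():
    rate = count / cases_reviewed
    print(f"{outcome}: {count}/{cases_reviewed} = {rate:.1%}")
```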
Case Volume versus Occurrence Volume*
*Data is fictitious and does not reflect Mayo Clinic performance.
Rankings Example*
*Data is fictitious and does not reflect Mayo Clinic performance.
Recommendations and Next Steps
- Based on the data, Outcome #1 and Outcome #3 should be addressed
- CPT Code Category #2 shows opportunity for improvement
- Additional detailed information is available by request: all case data pulled for specific CPT code categories and/or cases with specific outcomes
- Logistic regression analysis to identify factors that contribute to certain outcomes (sketched below)
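The logistic-regression step above could be sketched as follows, assuming a case-level table with an outcome flag and candidate risk factors. The column names (age, asa_class, op_time_min) and the data are hypothetical, simulated so the example runs end to end; this is not an actual NSQIP extract or the project's exact model.

```python
# Fit a logistic regression relating hypothetical risk factors to a
# simulated surgical-site-infection flag, then report odds ratios.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "age": rng.integers(18, 90, n),
    "asa_class": rng.integers(1, 5, n),      # ASA physical status 1-4
    "op_time_min": rng.normal(120, 40, n),   # operative time, minutes
})

# Fictitious outcome generated from a known model so the sketch runs.
logit = -6 + 0.02 * df["age"] + 0.5 * df["asa_class"] + 0.01 * df["op_time_min"]
df["ssi"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = sm.add_constant(df[["age", "asa_class", "op_time_min"]])
model = sm.Logit(df["ssi"], X).fit(disp=0)

# Odds ratios per unit increase in each factor; values > 1 suggest the
# factor is associated with higher odds of the outcome.
print(np.exp(model.params))
```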
Current State and Next Steps
Where are we now?
- A stable report that practices find helpful
- Worked through the five stages of data denial
- Creating an automated process to provide information in a timely manner
- Creating processes for follow-up and practice accountability
Summary
- All databases are not created equal
- Use the correct type of database for the information needed
- Address data denial
- De-emphasize rankings
- Create a report that magnifies the "hot spots" so practices can launch quality projects
Questions & Discussion