Performance Improvement Projects: Validating Process, Tips, and Hints
Eric Jackson, MA, Research Analyst
October 19, 2009

Overview of the Training
–Meeting Logistics
–Validation Process
–CMS Protocol Activities
–Tips and Hints
–Resources
–Questions and Answers
–Other Training Topics

Meeting Logistics
Using GoToWebinar™
How to ask questions to the team during the presentation
–Through the Questions section in the GoToWebinar application
–You can use the GoToWebinar "raise your hand" option
We will review submitted and additional questions at the end of the presentation

First, the Basics of GoToWebinar: The GoToWebinar Attendee Interface
1. Viewer Window
2. Control Panel

Validation Process for PIPs
Request desk materials
–All PIPs currently in progress or completed since the last EQR should be sent
–Example: NCQA Activity Forms
After receiving the desk materials, three PIPs are selected for validation
Perform the CMS Protocol Activities by reviewing the desk materials received

CMS Protocol Activities
Activity One: Assessing the Study Methodology
Activity Two: Verify Study Findings (optional)
Activity Three: Evaluating Overall Validity and Reliability of Study Results

Activity One: Assessing the Study Methodology
Ten steps with related review questions and scoring
–27 components/questions and 116 possible points
Specific questions can be found in the CCME PIP Validation Overview document

Activity One: Assessing the Study Methodology (cont.)
Each component is scored by the degree to which it meets the protocol requirements
–MET: component fully meets the criteria without any issues
–PARTIALLY MET: component meets some but not all of the criteria
–NOT MET: component fails to meet most or all of the criteria
–NA: component does not apply to the project being reviewed
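The slides do not say how these ratings convert into points, so the sketch below assumes a simple full/half/zero credit split for MET, PARTIALLY MET, and NOT MET, with NA components excluded from the possible-point total; the actual per-component point values come from the CMS protocol and the CCME scoring tool, not from this example.

```python
# Minimal roll-up sketch under the assumptions stated above: MET earns full
# credit, PARTIALLY MET half, NOT MET zero, and NA components are excluded
# from the possible-point total.
STATUS_CREDIT = {"MET": 1.0, "PARTIALLY MET": 0.5, "NOT MET": 0.0}

def score_components(components):
    """components: list of (status, max_points) pairs for one PIP."""
    earned, possible = 0.0, 0
    for status, max_points in components:
        if status == "NA":  # NA components do not count toward the score
            continue
        earned += STATUS_CREDIT[status] * max_points
        possible += max_points
    return earned, possible

# Hypothetical review of three components
print(score_components([("MET", 5), ("PARTIALLY MET", 4), ("NA", 3)]))  # (7.0, 9)
```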

Activity One: Step One
Review the Selected Study Topic(s)
–How was the topic of the study selected?
–Is the topic appropriate?
3 questions with 7 possible points

Activity One: Step Two
Review the Study Question(s)
–Is the study question documented?
Single question with 10 possible points

Activity One: Step Three
Review the Selected Study Indicator(s)
–Are the study indicators objective and clearly defined?
–Are the study indicators appropriate?
2 questions with 11 possible points

Activity One: Step Four
Review the Identified Study Population
–Is the study population well defined?
–Is the population being captured correctly?
2 questions with 6 possible points

Activity One: Step Five
Review the Sampling Methods
–Applies only if sampling was used in the project
–Is a valid sampling method being used?
–Is the sample large enough?
3 questions with 20 possible points

Activity One: Step Six
Review the Data Collection Procedures
–Are data sources clearly specified?
–Was a data analysis plan established in the documentation?
–Did the instruments used for data collection provide consistent, accurate data?
6 questions with 18 possible points

Activity One: Step Seven
Assess Improvement Strategies
–Are reasonable interventions being planned and implemented?
Single question with 10 possible points

Activity One: Step Eight
Review the Data Analysis and Interpretation of Study Results
–Was the data analysis plan followed?
–Are numerical results presented accurately and clearly?
4 questions with 17 possible points

Activity One: Step Nine
Assess Whether Improvement is "Real" Improvement
–Is there any documented, quantitative improvement?
–Was the same methodology used for the baseline and repeated measurement?
4 questions with 12 possible points

Activity One: Step Ten
Assess Sustained Improvement
–Is sustained improvement demonstrated in the repeated measurements of the study?
Single question with 5 possible points
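Taken together, the question counts and point values in the ten steps above add up to the 27 components/questions and 116 possible points noted at the start of Activity One; a small bookkeeping sketch (step names shortened from the slides) confirms the totals.

```python
# Activity One steps as listed above: (step, review questions, possible points).
ACTIVITY_ONE_STEPS = [
    ("Review the Selected Study Topic(s)",                     3,  7),
    ("Review the Study Question(s)",                           1, 10),
    ("Review the Selected Study Indicator(s)",                 2, 11),
    ("Review the Identified Study Population",                 2,  6),
    ("Review the Sampling Methods",                            3, 20),
    ("Review the Data Collection Procedures",                  6, 18),
    ("Assess Improvement Strategies",                          1, 10),
    ("Review the Data Analysis and Interpretation of Results", 4, 17),
    ("Assess Whether Improvement is 'Real' Improvement",       4, 12),
    ("Assess Sustained Improvement",                           1,  5),
]

total_questions = sum(q for _, q, _ in ACTIVITY_ONE_STEPS)  # 27 components/questions
total_points = sum(p for _, _, p in ACTIVITY_ONE_STEPS)     # 116 possible points
print(total_questions, total_points)
```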

Activity Two: Verify Study Findings
Optional activity
Requires the plan to produce the data that generated the results of the PIP
The reviewer would then attempt to reproduce the documented results from the data received from the plan

Activity Three: Evaluating Overall Validity and Reliability of Study Results
Scores are summarized
Validation Finding is calculated
o VF = (score the project received / total possible points)
o Multiply by 100 to report as a percentage
Final Audit Designation is assigned
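As a worked example of the formula above (the 98-point score is hypothetical), a minimal sketch:

```python
def validation_finding(points_received, total_possible_points):
    """Validation Finding: (score the project received / total possible points) x 100."""
    return points_received / total_possible_points * 100

# Hypothetical PIP that earned 98 of the 116 possible Activity One points
print(f"{validation_finding(98, 116):.1f}%")  # 84.5%
```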

Activity Three: Evaluating Overall Validity and Reliability of Study Results (cont.)
Score ranges for the Final Audit Designation:
–High Confidence in Reported Results: validation findings of 90%–100%; little to no documentation problems, or only minor issues that do not lower confidence in what the plan reports.
–Confidence in Reported Results: validation findings of 70%–89%; minor documentation or procedural problems that could impose a small bias on the results of the project.
–Low Confidence in Reported Results: validation findings of 60%–69%; the plan deviated from or failed to follow its documented procedure in a way that misused or misreported data, introducing major bias in the reported results.
–Reported Results NOT Credible: validation findings below 60%; major errors that put the results of the entire project in question.
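A minimal sketch of mapping a Validation Finding percentage onto the four designations above; treating the cutoffs as simple lower bounds is an assumption, since the slides state the ranges only in whole percentages.

```python
def audit_designation(vf_percent):
    """Map a Validation Finding percentage to a Final Audit Designation."""
    if vf_percent >= 90:
        return "High Confidence in Reported Results"
    if vf_percent >= 70:
        return "Confidence in Reported Results"
    if vf_percent >= 60:
        return "Low Confidence in Reported Results"
    return "Reported Results NOT Credible"

print(audit_designation(84.5))  # Confidence in Reported Results
```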

Tips and Hints
Remember the Study Questions in your project documentation!
Document the numerators and denominators along with the rates
Double-check rate calculations
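One way to apply the last two tips is to recompute each rate from its documented numerator and denominator and flag any mismatch; the field names and values below are made up purely for illustration.

```python
# Recompute each rate from its numerator and denominator and flag any mismatch.
measurements = [
    {"period": "Baseline",       "numerator": 110, "denominator": 200, "reported_rate": 55.0},
    {"period": "Re-measurement", "numerator": 190, "denominator": 200, "reported_rate": 95.0},
]

for m in measurements:
    recalculated = m["numerator"] / m["denominator"] * 100
    if abs(recalculated - m["reported_rate"]) > 0.05:
        print(f"{m['period']}: reported {m['reported_rate']}% vs recalculated {recalculated:.1f}%")
```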

Tips and Hints (cont.)
Check your terminology
o Percent change is not the same as percentage point change
Example:
Baseline = 55%
Re-measurement = 95%
Percent Change = (95% - 55%) / 55% ≈ 73% change
Percentage Point Change = 95% - 55% = 40 percentage points
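The same example, as a small sketch that computes both figures from the baseline and re-measurement rates:

```python
def percent_change(baseline, remeasurement):
    """Relative change, expressed as a percent of the baseline rate."""
    return (remeasurement - baseline) / baseline * 100

def percentage_point_change(baseline, remeasurement):
    """Absolute difference between the two rates."""
    return remeasurement - baseline

baseline, remeasurement = 55.0, 95.0
print(f"Percent change: {percent_change(baseline, remeasurement):.0f}%")                  # 73%
print(f"Percentage point change: {percentage_point_change(baseline, remeasurement):.0f}") # 40
```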

Resources
Go to:
–Search for SC EQR
–Training materials should be one of the top results
CCME PIP Validation Overview document
CMS PIP Protocol
o "Validating Performance Improvement Projects: A protocol for use in conducting Medicaid external quality review activities"

Questions and Answers

Additional Training Topics

Please Remember the Evaluation! It will display after you end the webinar.