Performance Improvement Project Validation Process
Outcome Focused Scoring Methodology and Critical Analysis
Presenter: Christi Melendez, RN, CPHQ, Associate Director, PIP Review Team

CMS PIP Protocol Changes
Activities III, IV, VII, and VIII have been reversed in order:
• Activity III: Use a Representative and Generalizable Study Population
• Activity IV: Select the Study Indicator(s)
• Activity VII: Data Analysis and Interpretation of Results
• Activity VIII: Improvement Strategies

Activity I: Choose the Study Topic
HSAG Evaluation Elements: The Study Topic
1. Is selected following collection and analysis of data (critical element)
2. Has the potential to affect member health, outcomes of care, functional status, or satisfaction

Activity II: State the Study Question
HSAG Evaluation Elements: The Study Question
1. States the problem to be studied in simple terms and is in the recommended X/Y format (critical element), e.g., "Does [intervention X] improve [outcome Y]?"

Activity III: Identify the Study Population
HSAG Evaluation Elements: The Study Population
1. Is accurately and completely defined and captures all members to whom the study question applies (critical element)

Activity IV: Select the Study Indicator
HSAG Evaluation Elements: The Study Indicator
1. Is well-defined, objective, and measures changes in health or functional status, consumer satisfaction, or valid process alternatives (critical element)
2. Includes the basis on which the indicator was adopted, if internally developed
3. Allows for the study question to be answered (critical element)

Activity V: Use Valid Sampling Techniques*
HSAG Evaluation Elements: Sampling Techniques
1. Specify the measurement period for the sampling methods used
2. Provide the title of the applicable study indicator
3. Identify the population size
4. Identify the sample size (critical element)
5. Specify the margin of error and confidence level
6. Describe in detail the methods used to select the sample
* Activity V is only scored if sampling techniques were used.
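Elements 3 through 5 above tie together the population size, sample size, margin of error, and confidence level. As a minimal sketch of how those quantities relate, here is the standard textbook approach (Cochran's formula for a proportion, with a finite population correction); this is not necessarily HSAG's prescribed calculation, and the function name is hypothetical:

```python
import math
from statistics import NormalDist

def required_sample_size(population_size: int,
                         margin_of_error: float = 0.05,
                         confidence_level: float = 0.95,
                         expected_proportion: float = 0.5) -> int:
    """Sample size needed to estimate a proportion within the given
    margin of error, with a finite population correction applied.
    expected_proportion = 0.5 is the most conservative assumption."""
    # z-score for a two-sided confidence level (1.96 for 95%)
    z = NormalDist().inv_cdf(1 - (1 - confidence_level) / 2)
    p = expected_proportion
    # Cochran's formula for an effectively infinite population
    n0 = (z ** 2) * p * (1 - p) / margin_of_error ** 2
    # Finite population correction
    return math.ceil(n0 / (1 + (n0 - 1) / population_size))

# Example: 4,000 eligible members, 5% margin of error, 95% confidence
print(required_sample_size(4000))  # 351
```

With these illustrative inputs, the sketch reproduces the familiar result for a 95%/±5% design: roughly 384 records, slightly fewer once the population correction is applied.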

Activity VI: Define Data Collection
HSAG Evaluation Elements: Data Collection
The data collection procedures:
1. Identify the data elements to be collected
2. Include a defined and systematic process for collecting baseline and remeasurement data

Activity VI: Define Data Collection
HSAG Evaluation Elements: Data Collection
The manual data collection procedures:
3. Include the qualifications of staff member(s) collecting manual data
4. Include a manual data collection tool that ensures consistent and accurate collection of data according to indicator specifications (critical element)

Activity VI: Define Data Collection
HSAG Evaluation Elements: Data Collection
The administrative data collection procedures:
5. Include an estimated degree of administrative data completeness
6. Describe the data analysis plan

Activity VII: Analyze Data and Interpret Study Results
HSAG Evaluation Elements: Data Analysis
1. Is conducted according to the data analysis plan in the study design
2. Allows for the generalization of results to the study population if a sample was selected (critical element)
3. Identifies factors that threaten internal or external validity of findings
4. Includes an interpretation of findings
* Evaluation Elements 1–5 in Activity VII are scored for PIPs that provide baseline data.

Activity VII: Analyze Data and Interpret Study Results
HSAG Evaluation Elements: Interpretation of Study Results
5. Is presented in a way that provides accurate, clear, easily understood information (critical element)
6. Identifies the initial measurement and the remeasurement of study indicators
7. Identifies statistical differences between the initial measurement and the remeasurement
8. Identifies factors that affect the ability to compare the initial measurement with the remeasurement
9. Includes an interpretation of the extent to which the study was successful
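Elements 6 and 7 ask the PIP to report the baseline and remeasurement rates and the statistical difference between them. A minimal sketch of one common way to test that difference, a two-sided two-proportion z-test with a pooled variance estimate (the counts are illustrative, and this is not necessarily HSAG's prescribed test):

```python
import math
from statistics import NormalDist

def two_proportion_z_test(hits_baseline: int, n_baseline: int,
                          hits_remeasure: int, n_remeasure: int):
    """Two-sided z-test (normal approximation, pooled variance) for the
    difference between a baseline rate and a remeasurement rate."""
    p1 = hits_baseline / n_baseline
    p2 = hits_remeasure / n_remeasure
    pooled = (hits_baseline + hits_remeasure) / (n_baseline + n_remeasure)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_baseline + 1 / n_remeasure))
    z = (p2 - p1) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p1, p2, z, p_value

# Illustrative: 62% at baseline vs. 70% at Remeasurement 1
p1, p2, z, p = two_proportion_z_test(248, 400, 280, 400)
print(f"{p1:.0%} -> {p2:.0%}: z = {z:.2f}, p = {p:.4f}")  # p ≈ 0.017
```

In this illustrative case p < 0.05, so the improvement from baseline would count as statistically significant at the conventional level.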

Activity VIII: Implementing Interventions and Improvement Strategies
HSAG Evaluation Elements: Improvement Strategies
1. Are related to causes/barriers identified through data analysis and quality improvement processes (critical element)
2. Are system changes that are likely to induce permanent change
3. Are revised if the original interventions are not successful
4. Are standardized and monitored if interventions are successful

Activity IX: Real Improvement*
HSAG Evaluation Elements: Report Improvement
1. The remeasurement methodology is the same as the baseline methodology
2. There is documented improvement in processes or outcomes of care
3. There is statistical evidence that observed improvement is true improvement over baseline (critical element)
4. The improvement appears to be the result of planned intervention(s)
* Activity IX is scored when the PIP has progressed to Remeasurement 1 and will be scored on an annual basis until statistically significant improvement is achieved from baseline to a subsequent remeasurement for all study indicators. Once Evaluation Element 3 receives a Met score, it will remain Met for the duration of the PIP.

Activity X: Sustained Improvement*
HSAG Evaluation Elements: Sustained Improvement
1. Repeated measurements over comparable time periods demonstrate sustained improvement, or that a decline in improvement is not statistically significant (critical element)
* HSAG will not validate Activity X until statistically significant improvement has been achieved across all study indicators. Once statistically significant improvement is achieved, the MCO must document a subsequent remeasurement period showing that the improvement was sustained in order to receive an overall Met validation status.
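Continuing the earlier z-test sketch, Activity X's criterion (improvement is sustained unless a subsequent remeasurement shows a statistically significant decline) might be expressed as follows. This is a hypothetical reading of the rule, and it reuses the two_proportion_z_test function defined in the previous sketch:

```python
def improvement_sustained(hits_r1: int, n_r1: int,
                          hits_r2: int, n_r2: int,
                          alpha: float = 0.05) -> bool:
    """Sustained improvement: the follow-up remeasurement (R2) does not
    decline significantly from the remeasurement that achieved
    significant improvement over baseline (R1).
    Assumes two_proportion_z_test from the earlier sketch is in scope."""
    p1, p2, _, p_value = two_proportion_z_test(hits_r1, n_r1, hits_r2, n_r2)
    significant_decline = p2 < p1 and p_value < alpha
    return not significant_decline

# Illustrative: 70% at Remeasurement 1, 68% at Remeasurement 2
print(improvement_sustained(280, 400, 272, 400))  # True; the dip is not significant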

PIP Tool Format
Old Tool Format:
• 10 Activities
• 53 Evaluation Elements
• Activity VII: Interventions
• Activity VIII: Data Analysis
• 13 Critical Elements
New Tool Format:
• 10 Activities
• 37 Evaluation Elements
• Activity III: Study Population
• Activity IV: Study Indicator(s)
• Activity VII: Data Analysis
• Activity VIII: Interventions
• 12 Critical Elements

Outcome Focused PIP Scoring
HSAG Evaluation Tool
• 37 Evaluation Elements Total
• 12 Critical Elements (CE)
• Activity I: 1 CE
• Activity II: 1 CE
• Activity III: 1 CE
• Activity IV: 2 CE
• Activity V: 1 CE
• Activity VI: 1 CE
• Activity VII: 2 CE
• Activity VIII: 1 CE
• Activity IX: 1 CE
• Activity X: 1 CE
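To make the roll-up concrete, here is a hypothetical tally of the scoring structure. The per-activity critical-element counts are reconstructed from the (critical element) markers on the preceding slides; the rule that every critical element must be scored Met for an overall Met validation status is an assumption for illustration, not HSAG's published algorithm:

```python
# Critical elements (CE) per activity, as tallied from the slides above
# (they sum to the stated 12 CE out of 37 evaluation elements).
CE_PER_ACTIVITY = {
    "I": 1, "II": 1, "III": 1, "IV": 2, "V": 1,
    "VI": 1, "VII": 2, "VIII": 1, "IX": 1, "X": 1,
}

def overall_validation_status(critical_results: dict) -> str:
    """critical_results maps each activity to the list of Met (True) /
    Not Met (False) scores for its critical elements only."""
    for activity, expected_count in CE_PER_ACTIVITY.items():
        scores = critical_results.get(activity, [])
        # Hypothetical rule: every critical element must be scored Met
        if len(scores) != expected_count or not all(scores):
            return "Not Met"
    return "Met"

# All critical elements Met -> overall Met
print(overall_validation_status(
    {a: [True] * n for a, n in CE_PER_ACTIVITY.items()}))  # Met
```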

Outcome Focused PIP Scoring Changes
• Activity VII
- Evaluation Element 5 is critical
- MCOs should ensure that the data reported in all PIPs are accurate and align with what has been reported in their IDSS
• Activity IX
- Evaluation Elements 3 and 4 have been reversed
- New criteria for scoring Activity IX
• Activity X
- New criteria for scoring Activity X

Activity IX: Outcome Focused PIP Scoring
HSAG Evaluation Elements: Assessing for Real Improvement
1. The remeasurement methodology is the same as the baseline methodology
2. There is documented improvement in processes or outcomes of care
3. There is statistical evidence that observed improvement is true improvement over baseline and across all study indicators
4. The improvement appears to be the result of planned intervention(s)

Activity X: Outcome Focused PIP Scoring
HSAG Evaluation Elements: Assessing for Sustained Improvement
1. Repeated measurements over comparable time periods demonstrate sustained improvement, or that a decline in improvement is not statistically significant, across all study indicators

Outcome Focused PIP Scoring
• Activity IX
- Repeated measurement of the indicators demonstrates meaningful change in performance
- Improvement must be statistically significant for all study indicators to receive an overall Met validation status
- Is scored on an annual basis until statistically significant improvement over baseline has been achieved for all study indicators
- Once Evaluation Element 3 receives a Met score, it will remain Met for the duration of the PIP
- Evaluation Elements 3 and 4 are linked

Outcome Focused PIP Scoring
• Activity X
- Repeated measurement of the indicators demonstrates sustained improvement
- HSAG will not validate Activity X until Evaluation Element 3 of Activity IX is Met
- Once statistically significant improvement has been achieved for all indicators, the MCO will need to document a subsequent measurement period demonstrating sustained improvement in order to receive a Met in Activity X

Outcome Focused PIP Rationale
• Overall Met Validation Status
- The changes align the actual outcomes of the project with the overall validation status
- Emphasis on statistically significant, sustained improvement in outcomes

Critical Analysis
HSAG will be evaluating whether or not…
• A current causal/barrier analysis was completed. MCOs should conduct an annual causal/barrier and drill-down analysis in addition to periodic analyses of their most recent data, and should include the updated causal/barrier analysis outcomes in their PIPs.

Critical Analysis
HSAG will be evaluating whether or not…
• Barriers and interventions were relevant to the focus of the study and can impact the study indicator(s) outcomes

Critical Analysis
For any intervention implemented, the MCO should have a process in place to evaluate the efficacy of the intervention and determine whether it is having the desired effect. This evaluation process should be detailed in the PIP documentation. If the interventions are not having the desired effect, the MCO should discuss how it will address these deficiencies and what changes it will make to its improvement strategies.

Critical Analysis
The MCO should ensure that the intervention(s) implemented will impact the study indicator(s) outcomes.
• Member-focused interventions will not impact a study indicator measuring the quality of service provided by a PCP, e.g., the WCC HEDIS measure in a Childhood Obesity PIP
• Interventions focused on educating MCO staff on HEDIS measures will not impact members accessing care and seeking well-child visits

Critical Analysis
The MCO should be cognizant of the timing of interventions. Interventions implemented in the last few months of the measurement year will not have been in place long enough to have an impact on the results.

Questions and Answers