Performance Improvement Projects Technical Assistance Prepaid Mental Health Plans
Thursday, March 29, 2007, 10:30 a.m. – 12:15 p.m.
Cheryl L. Neel, RN, MPH, CPHQ, Manager, Performance Improvement Projects
David Mabb, MS, Sr. Director, Statistical Evaluation

Presentation Outline
PIP Overall Comments
Aggregate MCO PIP Findings
Aggregate PMHP-Specific Findings
Technical Assistance with Group Activities
–Study Design
–Study Implementation
–Quality Outcomes Achieved
Questions and Answers

Key PIP Strategies
1. Conduct outcome-oriented projects
2. Achieve demonstrable improvement
3. Sustain improvement
4. Correct systemic problems

Validity and Reliability of PIP Results
Activity 3 of the CMS Validating Protocol: Evaluating overall validity and reliability of PIP results:
–Met = Confidence/High confidence in reported PIP results
–Partially Met = Low confidence in reported PIP results
–Not Met = Reported PIP results not credible

Summary of PIP Validation Scores

Proportion of PIPs Meeting the Requirements for Each Activity

Aggregate Valid Percent Met [chart: percent of evaluation elements Met for Activities I through X]

PMHP-Specific Findings
8 PIPs submitted
Scores ranged from 37% to 89%
Average score was 77%
Assessed evaluation elements were scored as Met 78% of the time

Summary of PMHP Validation Scores

Study Design
Four Components:
1. Activity I. Selecting an Appropriate Study Topic
2. Activity II. Presenting Clearly Defined, Answerable Study Question(s)
3. Activity III. Documenting Clearly Defined Study Indicator(s)
4. Activity IV. Stating a Correctly Identified Study Population

Activity I. Selecting an Appropriate Study Topic - PMHP Overall Score

Activity I. Selecting an Appropriate Study Topic
Results:
92 percent of the six evaluation elements were Met
8 percent were Partially Met or Not Met
None of the evaluation elements were Not Applicable or Not Assessed

Activity I: Review the Selected Study Topic
HSAG Evaluation Elements:
Reflects high-volume or high-risk conditions (or was selected by the State).
Is selected following collection and analysis of data (or was selected by the State).
Addresses a broad spectrum of care and services (or was selected by the State).
Includes all eligible populations that meet the study criteria.
Does not exclude members with special health care needs.
Has the potential to affect member health, functional status, or satisfaction.
Bolded evaluation elements show areas for improvement.
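The second element above ("selected following collection and analysis of data") is typically documented with a short analysis of the plan's own administrative data. The sketch below is one illustrative way to rank diagnosis categories by volume in a claims extract; the file name and column names (member_id, diagnosis_category) are hypothetical, not part of the HSAG protocol.

```python
import pandas as pd

# Hypothetical claims extract; file and column names are illustrative assumptions.
claims = pd.read_csv("claims_extract.csv")  # columns: member_id, diagnosis_category, service_date

# Rank diagnosis categories by claim volume and by distinct members affected,
# a simple way to show the topic reflects high-volume conditions.
summary = (
    claims.groupby("diagnosis_category")
    .agg(claim_count=("member_id", "size"),
         member_count=("member_id", "nunique"))
    .sort_values("claim_count", ascending=False)
)
print(summary.head(10))  # top candidate topics by volume
```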

Activity II. Presenting Clearly Defined, Answerable Study Question(s) - PMHP Overall Score No PIP studies received a Met for both evaluation elements

Activity II. Presenting Clearly Defined, Answerable Study Question(s)
Results:
0 percent of the two evaluation elements were Met
100 percent were Partially Met or Not Met
None of the evaluation elements were Not Applicable or Not Assessed

Activity II: Review the Study Question(s)
HSAG Evaluation Elements:
States the problem to be studied in simple terms.
Is answerable.
Bolded evaluation elements show areas for improvement.

Activity III. Documenting Clearly Defined Study Indicator(s) - PMHP Overall Score

Activity III. Documenting Clearly Defined Study Indicator(s)
Results:
59 percent of the seven evaluation elements were Met
21 percent were Partially Met or Not Met
20 percent of the evaluation elements were Not Applicable or Not Assessed

Activity III: Review Selected Study Indicator(s)
HSAG Evaluation Elements:
Is well defined, objective, and measurable.
Is based on practice guidelines, with sources identified.
Allows for the study question to be answered.
Measures changes (outcomes) in health or functional status, member satisfaction, or valid process alternatives.
Has available data that can be collected on each indicator.
Is a nationally recognized measure, such as HEDIS®, when appropriate.
Includes the basis on which each indicator was adopted, if internally developed.
Bolded evaluation elements show areas for improvement.

Activity IV. Stating a Correctly Identified Study Population - PMHP Overall Score

Activity IV. Stating a Correctly Identified Study Population
Results:
58 percent of the three evaluation elements were Met
13 percent were Partially Met or Not Met
29 percent of the evaluation elements were Not Applicable or Not Assessed

Activity IV: Review the Identified Study Population
HSAG Evaluation Elements:
Is accurately and completely defined.
Includes requirements for the length of a member’s enrollment in the managed care plan.
Captures all members to whom the study question applies.
Bolded evaluation elements show areas for improvement.

Group Activity

Study Implementation
Three Components:
1. Activity V. Valid Sampling Techniques
2. Activity VI. Accurate/Complete Data Collection
3. Activity VII. Appropriate Improvement Strategies

Activity V. Presenting a Valid Sampling Technique - PMHP Overall Score

Activity V. Presenting a Valid Sampling Technique
Results:
3 out of the 8 PIP studies used sampling.
38 percent of the six evaluation elements were Met.
0 percent were Partially Met or Not Met.
63 percent of the evaluation elements were Not Applicable or Not Assessed.

Activity V: Review Sampling Methods
*This section is only validated if sampling is used.
HSAG Evaluation Elements:
Consider and specify the true or estimated frequency of occurrence. (N=8)
Identify the sample size. (N=8)
Specify the confidence level to be used. (N=8)
Specify the acceptable margin of error. (N=8)
Ensure a representative sample of the eligible population. (N=8)
Ensure that the sampling techniques are in accordance with generally accepted principles of research design and statistical analysis. (N=8)
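The frequency of occurrence, confidence level, and margin of error named above together determine the minimum sample size. A minimal sketch of the standard sample-size formula for a proportion, with a finite-population correction, follows; the 50 percent assumed rate, 95 percent confidence, ±5 percent margin, and population of 2,500 are illustrative assumptions, not values required by the protocol.

```python
import math

def sample_size_for_proportion(population, p=0.5, confidence_z=1.96, margin=0.05):
    """Minimum sample size for estimating a proportion.

    p            -- estimated frequency of occurrence (0.5 is the conservative default)
    confidence_z -- z-score for the chosen confidence level (1.96 for 95 percent)
    margin       -- acceptable margin of error (0.05 = +/- 5 percentage points)
    """
    n0 = (confidence_z ** 2) * p * (1 - p) / (margin ** 2)  # infinite-population sample size
    n = n0 / (1 + (n0 - 1) / population)                    # finite-population correction
    return math.ceil(n)

# Example: hypothetical eligible population of 2,500 members
print(sample_size_for_proportion(2500))  # roughly 334 records
```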

Populations or Samples?
Generally,
–Administrative data uses populations
–The hybrid (chart abstraction) method uses samples identified through administrative data

Activity VI. Specifying Accurate/Complete Data Collection - PMHP Overall Score

Activity VI. Specifying Accurate/Complete Data Collection
Results:
55 percent of the eleven evaluation elements were Met
10 percent were Partially Met or Not Met
35 percent of the evaluation elements were Not Applicable or Not Assessed

Activity VI: Review Data Collection Procedures
HSAG Evaluation Elements:
Clearly defined data elements to be collected. (55 percent Met)
Clearly identified sources of data. (77 percent Met)
A clearly defined and systematic process for collecting data that includes how baseline and remeasurement data will be collected. (62 percent Met)
A timeline for the collection of baseline and remeasurement data. (57 percent Met)
Qualified staff and personnel to collect manual data. (13 percent Met; 77 percent N/A)
A manual data collection tool that ensures consistent and accurate collection of data according to indicator specifications. (13 percent Met; 77 percent N/A)
Bolded evaluation elements show areas for improvement.

Activity VI: Review Data Collection Procedures (cont.)
HSAG Evaluation Elements:
A manual data collection tool that supports interrater reliability. (13 percent Met; 77 percent N/A)
Clear and concise written instructions for completing the manual data collection tool. (13 percent Met; 77 percent N/A)
An overview of the study in the written instructions.
Administrative data collection algorithms that show steps in the production of indicators.
An estimated degree of automated data completeness (important if using the administrative method).
Bolded evaluation elements show areas for improvement.
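One common way to show that a manual abstraction tool "supports interrater reliability" is to have two abstractors independently review the same records and report an agreement statistic. The sketch below computes percent agreement and Cohen's kappa from scratch; the two abstractors' yes/no determinations are made-up illustrations, and plans may use any established reliability statistic they prefer.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters applying the same yes/no abstraction tool."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in set(rater_a) | set(rater_b)) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical numerator determinations ("Y" = numerator hit) for 10 shared records.
abstractor_1 = ["Y", "Y", "N", "Y", "N", "Y", "Y", "N", "Y", "Y"]
abstractor_2 = ["Y", "Y", "N", "Y", "Y", "Y", "Y", "N", "Y", "N"]

agreement = sum(a == b for a, b in zip(abstractor_1, abstractor_2)) / len(abstractor_1)
print(f"Percent agreement: {agreement:.0%}")                       # 80%
print(f"Cohen's kappa: {cohens_kappa(abstractor_1, abstractor_2):.2f}")  # about 0.52
```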

Where do we look for our sources of data?

Baseline Data Sources
Medical records
Administrative claims/encounter data
Hybrid HEDIS
Survey data
MCO program data
Other

Activity VII. Documenting the Appropriate Improvement Strategies - PMHP Overall Score

Activity VII. Documenting the Appropriate Improvement Strategies
Results:
44 percent of the four evaluation elements were Met
3 percent were Partially Met or Not Met
53 percent of the evaluation elements were Not Applicable or Not Assessed

Activity VII: Assess Improvement Strategies
HSAG Evaluation Elements:
Related to causes/barriers identified through data analysis and Quality Improvement (QI) processes.
System changes that are likely to induce permanent change.
Revised if original interventions are not successful.
Standardized and monitored if interventions are successful.
Bolded evaluation elements show areas for improvement.

Determining Interventions Once you know how you are doing at baseline, what interventions will produce meaningful improvement in the target population?

First, Do a Barrier Analysis
What did an analysis of baseline results show?
How can we relate it to system improvement?
Opportunities for improvement
Determine intervention(s)
Identify barriers to reaching improvement

How were the intervention(s) chosen?
By reviewing the literature
–Evidence-based
–Pros & cons
–Benefits & costs
Develop a list of potential interventions: what is most effective?

Types of Interventions
Education
Provider performance feedback
Reminders & tracking systems
Organizational changes
Community-level interventions
Mass media

Choosing Interventions
Balance
–potential for success with ease of use
–acceptability to providers & collaborators
–cost considerations (direct and indirect)
Feasibility
–adequate resources
–adequate staff and training to ensure a sustainable effort

Physician Interventions: Multifaceted Approaches Are Most Effective
Most effective:
–real-time reminders
–outreach/detailing
–opinion leaders
–provider profiles
Less effective:
–educational materials (alone)
–formal CME programs without enabling or reinforcing strategies

Patient Interventions
Educational programs
–Disease-specific education booklets
–Lists of questions to ask your physician
–Organizing materials: flowsheets, charts, reminder cards
–Screening instruments to detect complications
–Direct mailing, media ads, websites

Evaluating Interventions
Does it target a specific quality indicator?
Is it aimed at appropriate stakeholders?
Is it directed at a specific process/outcome of care or service?
Did the intervention begin after the baseline measurement period?

Interventions Checklist
✓ Analyze barriers (root causes)
✓ Choose & understand target audience
✓ Select interventions based on cost-benefit
✓ Track intermediate results
✓ Evaluate effectiveness
✓ Modify interventions as needed
✓ Re-measure

Group Activity

Quality Outcomes Achieved
Three Components:
1. Activity VIII. Presentation of Sufficient Data Analysis and Interpretation
2. Activity IX. Evidence of Real Improvement Achieved
3. Activity X. Data Supporting Sustained Improvement Achieved

Activity VIII. Presentation of Sufficient Data Analysis and Interpretation - PMHP Overall Score

Activity VIII. Presentation of Sufficient Data Analysis and Interpretation
Results:
14 percent of the nine evaluation elements were Met
8 percent of the evaluation elements were Partially Met or Not Met
78 percent of the evaluation elements were Not Applicable or Not Assessed

Activity VIII: Review Data Analysis and Interpretation of Study Results
HSAG Evaluation Elements:
Is conducted according to the data analysis plan in the study design.
Allows for generalization of the results to the study population if a sample was selected.
Identifies factors that threaten internal or external validity of findings.
Includes an interpretation of findings.
Is presented in a way that provides accurate, clear, and easily understood information.
Bolded evaluation elements show areas for improvement.

Activity VIII: Review Data Analysis and Interpretation of Study Results (cont.)
HSAG Evaluation Elements:
Identifies initial measurement and remeasurement of study indicators.
Identifies statistical differences between initial measurement and remeasurement.
Identifies factors that affect the ability to compare initial measurement with remeasurement.
Includes the extent to which the study was successful.
Bolded evaluation elements show areas for improvement.

Changes in Study Design?
The study design should be the same as at baseline:
–Data source
–Data collection methods
–Data analysis
–Target population or sample size
–Sampling methodology
If there is a change, the rationale must be specified and appropriate.

Activity IX. Evidence of Real Improvement - PMHP Overall Score

Activity IX. Evidence of Real Improvement
Results:
16 percent of the four evaluation elements were Met
9 percent were Partially Met or Not Met
75 percent of the evaluation elements were Not Applicable or Not Assessed

Activity IX: Assess the Likelihood that Reported Improvement is “Real” Improvement
HSAG Evaluation Elements:
The remeasurement methodology is the same as the baseline methodology.
There is documented improvement in processes or outcomes of care.
The improvement appears to be the result of intervention(s).
There is statistical evidence that observed improvement is true improvement.
Bolded evaluation elements show areas for improvement.

Statistical Significance Testing

Time Period | Measurement Period | Numerator | Denominator | Rate or Results | Industry Benchmark | Statistical Testing and Significance
CY 2003 | Baseline | – | – | –% | 60% | N/A
CY 2004 | Remeasurement | – | – | –% | 60% | Chi-square = 2.8; p-value = –; NOT SIGNIFICANT AT THE 95% CONFIDENCE LEVEL
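The chi-square comparison in the table above can be reproduced with a standard two-proportion test once the actual numerators and denominators (not legible in this copy of the slide) are filled in. The counts in the sketch below are purely hypothetical placeholders chosen to mirror a non-significant result; swap in the PIP's real baseline and remeasurement counts. scipy's chi2_contingency is used here, though any equivalent two-proportion test leads to the same conclusion.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical counts -- replace with the PIP's actual numerators and denominators.
baseline_hits, baseline_total = 420, 1000      # CY 2003 baseline
remeasure_hits, remeasure_total = 455, 1000    # CY 2004 remeasurement

table = np.array([
    [baseline_hits, baseline_total - baseline_hits],
    [remeasure_hits, remeasure_total - remeasure_hits],
])

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"Baseline rate: {baseline_hits / baseline_total:.1%}")
print(f"Remeasurement rate: {remeasure_hits / remeasure_total:.1%}")
print(f"Chi-square = {chi2:.2f}, p-value = {p_value:.3f}")
print("Significant at the 95% confidence level" if p_value < 0.05
      else "NOT significant at the 95% confidence level")
```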

Activity X. Data Supporting Sustained Improvement Achieved - PMHP Overall Score
No PMHP reached this activity.

Activity X. Data Supporting Sustained Improvement Achieved
Results:
0 percent of the one evaluation element was Met
0 percent was Partially Met or Not Met
100 percent of the evaluation element was Not Applicable or Not Assessed

Activity X: Assess Whether Improvement is Sustained
HSAG Evaluation Elements:
Repeated measurements over comparable time periods demonstrate sustained improvement, or that a decline in improvement is not statistically significant.
Bolded evaluation elements show areas for improvement.

Quality Outcomes Achieved [timeline: Baseline → 1st Yr Demonstrable Improvement → Sustained Improvement]

Modifications in interventions
Changes in study design
Improvement sustained for 1 year

HSAG Contact Information
Cheryl Neel, RN, MPH, CPHQ, Manager, Performance Improvement Projects
Denise Driscoll, Administrative Assistant

Questions and Answers