Performance Improvement Projects (PIPs) Agency for Health Care Administration (AHCA) Tuesday, June 27, 2006, 2:45 p.m. – 4:45 p.m. David Mabb, MS, CHCA, Senior Director of Statistical Evaluation Cheryl L. Neel, RN, MPH, CPHQ, Manager, Performance Improvement Projects

Presentation Outline Overview of the PIP process PIP Summary Form Review – MCO demographics – CMS rationale – HSAG evaluation elements PIP Scoring Methodology Deliverables through the end of 2006 Questions and Answers

Overview of PIPs What is a PIP? It is a quality improvement project. What is the purpose of a PIP? To assess and improve processes and, subsequently, outcomes of care. It typically consists of a baseline, intervention period(s), and remeasurement(s).

Overview of PIPs (cont.) The PIP process provides an opportunity to: Identify and measure a targeted area (clinical or nonclinical) Analyze the results Implement interventions for improvement

Overview of PIPs (cont.) Useful PIP “SIDE EFFECTS” Develop a framework for future performance improvement projects May improve member satisfaction Improve HEDIS rates as a “bonus” to improving other health outcomes

Overview of PIPs (cont.) HSAG's role: Validates PIPs using the CMS protocol, Validating Performance Improvement Projects: A Protocol for Use in Conducting Medicaid External Quality Review Activities, Final Protocol, Version 1.0. PIP validation is a desk-audit evaluation: HSAG assesses the study's findings for the likely validity and reliability of the results. Provides PIP validation reports to AHCA and the MCOs. Identifies best practices.

PIP Summary Form Review Health plan demographics (first page of the submission form) Discuss the 10 PIP Activities – CMS Rationale – HSAG evaluation elements

A. Activity One: Choose the Selected Study Topic CMS Rationale Impacts a significant portion of the members. Reflects Medicaid enrollment in terms of demographic characteristics, prevalence of disease, and the potential consequences (risks) of the disease.

A. Activity One: Choose the Selected Study Topic CMS Rationale Addresses the need for a specific service. Goal should be to improve processes and outcomes of health care. The study topic may be specified by the State Medicaid agency or on the basis of Medicaid enrollee input.

A. Activity One: Choose the Selected Study Topic HSAG Evaluation Elements Reflects high-volume or high-risk conditions (or was selected by the State). Is selected following collection and analysis of data (or was selected by the State). Addresses a broad spectrum of care and services (or was selected by the State).

A. Activity One: Choose the Selected Study Topic HSAG Evaluation Elements (cont.) Includes all Medicaid eligible populations that meet the study criteria. Includes members with special health care needs. Has the potential to affect member health, functional status, or satisfaction.

A. Activity One: Choose the Selected Study Topic Example Study Topics: Cervical Cancer Screening HbA1c Testing Flu Vaccinations Initial Contact Data Systems Early and Periodic Screening, Diagnostic and Treatment (EPSDT) Services for Children 1–3 Years of Age Early Entrance to Prenatal Care and Checkups after Delivery

B. Activity Two: The Study Question CMS Rationale Stating the question(s) helps maintain the focus of the PIP and sets the framework for data collection, analysis, and interpretation.

B. Activity Two: The Study Question HSAG Evaluation Elements States the problem to be studied in simple terms. Is answerable/provable. In general, the question should take the form: Does doing X result in Y? Example: Will increased planning and attention to the importance of follow-up after inpatient discharge improve the rate of members receiving follow-up services?

C. Activity Three: Selected Study Indicators CMS Rationale Quantitative or qualitative characteristic. Discrete event (member has or has not had XX). Appropriate for the study topic. Objective, clearly and unambiguously defined.

C. Activity Three: Selected Study Indicators HSAG Evaluation Elements The study indicator(s): Is well defined, objective, and measurable. Is based on practice guidelines, with sources identified.

C. Activity Three: Selected Study Indicators HSAG Evaluation Elements (cont.) The study indicator(s): Allows for the study question/hypothesis to be answered or proven. Measures changes (outcomes) in health or functional status, member satisfaction, or valid process alternatives.

C. Activity Three: Selected Study Indicators HSAG Evaluation Elements (cont.) The study indicator(s): Has available data that can be collected on each indicator. Is a nationally recognized measure, such as HEDIS®, when appropriate. Includes the basis on which each indicator was adopted, if internally developed. HEDIS® is a registered trademark of the National Committee for Quality Assurance (NCQA).

D. Activity Four: Identified Study Population CMS Rationale Represents the entire Medicaid eligible enrolled population. Allows system-wide measurement. Implements improvement efforts to which the study indicators apply.

D. Activity Four: Identified Study Population HSAG Evaluation Elements The method for identifying the eligible population: Is accurately and completely defined. Includes requirements for the length of a member's enrollment in the managed care plan. Captures all members to whom the study question applies.

D. Activity Four: Identified Study Population Example of Study Population: All Medicaid children with at least 11 months (12 months with one 30-day gap of enrollment) of continuous enrollment in the health plan, who were born on or between January 1, 2001, and December 31,
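
The continuous-enrollment rule in this example (11 of 12 months, allowing one gap of up to 30 days) is easy to misapply when written in prose. Below is a minimal sketch of one way to encode it against monthly enrollment flags; the function and data layout are hypothetical illustrations, not part of the PIP Summary Form or the CMS protocol.

```python
# Hypothetical sketch: apply a continuous-enrollment rule of
# "11 of 12 months, allowing one gap of no more than 30 days"
# to monthly enrollment flags. Not from the CMS protocol text.

def continuously_enrolled(months_enrolled: list[bool]) -> bool:
    """True if the member has at most one gap, and that gap spans
    only a single month (~30 days), over the 12-month window."""
    if len(months_enrolled) != 12:
        raise ValueError("expected 12 monthly enrollment flags")
    # Count runs of consecutive unenrolled months.
    gaps = []
    run = 0
    for enrolled in months_enrolled:
        if enrolled:
            if run:
                gaps.append(run)
            run = 0
        else:
            run += 1
    if run:
        gaps.append(run)
    # At most one gap, and it may span only one month.
    return len(gaps) == 0 or (len(gaps) == 1 and gaps[0] == 1)

# Example: enrolled all year except April -> still eligible.
flags = [True] * 12
flags[3] = False
print(continuously_enrolled(flags))  # True
```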

E. Activity Five: Valid Sampling Techniques CMS Rationale Sample size impacts the level of statistical confidence in the study. – Statistical confidence is a numerical statement of the probable degree of certainty or accuracy of an estimate. Reflects improvement efforts to which the study indicators apply. Reflects the entire population or a sample of that population.

E. Activity Five: Valid Sampling Techniques HSAG Evaluation Elements Consider and specify the true or estimated frequency of occurrence (or the number of eligible members in the population). Identify the sample size (or use the entire population). Specify the confidence interval to be used (or use the entire population).

E. Activity Five: Valid Sampling Techniques HSAG Evaluation Elements (cont.) Specify the acceptable margin of error (or use the entire population). Ensure a representative sample of the eligible population. Ensure that the sampling techniques are in accordance with generally accepted principles of research design and statistical analysis.
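
The protocol leaves the actual numbers to the plan. As a point of reference, the frequency of occurrence, confidence level, and margin of error named above combine into a sample size through the standard proportion formula with a finite-population correction. The sketch below uses the conventional 95 percent confidence and 5 percent margin defaults as assumptions, not HSAG requirements.

```python
# Illustrative sample-size calculation for a proportion, with a
# finite-population correction. Conventional formula; the 95%/5%
# defaults are assumptions, not CMS or HSAG requirements.
import math

def sample_size(population: int, p: float = 0.5,
                z: float = 1.96, margin: float = 0.05) -> int:
    """z=1.96 ~ 95% confidence; p=0.5 is the most conservative
    assumed frequency of occurrence; margin is the acceptable
    margin of error."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2   # infinite-population size
    n = n0 / (1 + (n0 - 1) / population)        # finite-population correction
    return math.ceil(n)

# 10,000 eligible members, 95% confidence, +/-5% margin of error:
print(sample_size(10_000))  # 370
```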

F. Activity Six: Data Collection Procedures, Data Collection Cycle, and Data Analysis CMS Rationale Automated data collection Manual data collection Inter-rater reliability Frequency of collection and analysis cycle

F. Activity Six: Data Collection Procedures, Data Collection Cycle, and Data Analysis HSAG Evaluation Elements The data collection techniques: Provide clearly defined data elements to be collected. Clearly specify sources of data. Provide for a clearly defined and systematic process for collecting data that includes how baseline and remeasurement data will be collected.

F. Activity Six: Data Collection Procedures, Data Collection Cycle, and Data Analysis HSAG Evaluation Elements (cont.) The data collection techniques: Provide for a timeline for the collection of baseline and remeasurement data. Provide for qualified staff and personnel to collect manual data.

F. Activity Six: Data Collection Procedures, Data Collection Cycle, and Data Analysis HSAG Evaluation Elements (cont.) The manual data collection tool: Ensures consistent and accurate collection of data according to indicator specifications. Supports inter-rater reliability. Has clear and concise written instructions for completion.

HSAG Evaluation Elements (cont.) An overview of the study in the written manual data collection tool instructions. Automated data collection algorithms that show steps in the production of indicators. An estimated degree of automated data completeness (important if using the administrative method).

F. Activity Six: Data Collection Procedures, Data Collection Cycle, and Data Analysis Other Considerations: Determine the time period of data to be abstracted, e.g.: – From birth to end of study period – Only EPSDT services provided during study period – All visits or EPSDT visits only? – Calendar or fiscal year?

F. Activity Six: Data Collection Procedures, Data Collection Cycle, and Data Analysis Other Considerations (cont.): Administrative method or hybrid method Design and test data collection tool Electronic or hard copy tools? Enter data into database on laptop Keypunch data from hard copy

F. Activity Six: Data Collection Procedures, Data Collection Cycle, and Data Analysis Other Considerations (cont.): Design tool instructions Clearly defines study indicators Attempts to cover a variety of medical record scenarios Supports reliability

F. Activity Six: Data Collection Procedures, Data Collection Cycle, and Data Analysis Other Considerations (cont.): On-site vs. mail-in data collection? Allow 10 to 12 weeks for medical record retrieval and abstraction. Health plans should conduct and document ongoing monitoring of abstractors. HSAG recommends re-abstracting 5% of records per abstractor using the "rater to standard" method.
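
One way to read the "rater to standard" recommendation: re-abstract a 5 percent sample of each abstractor's records against an expert (standard) abstraction and compare field by field. The sketch below is a hypothetical illustration of that agreement calculation; the field names and the 90 percent retraining threshold are assumptions, not HSAG specifications.

```python
# Hypothetical "rater to standard" agreement check: re-abstract a
# sample of each abstractor's records and compare field by field
# against an expert standard. Field names and the 90% threshold
# are illustrative assumptions.

def agreement_rate(abstractor: list[dict], standard: list[dict]) -> float:
    """Percent of fields where the abstractor matches the expert
    standard, across the re-abstracted sample."""
    matches = total = 0
    for a_rec, s_rec in zip(abstractor, standard):
        for field, gold in s_rec.items():
            total += 1
            matches += (a_rec.get(field) == gold)
    return 100.0 * matches / total

abstractor = [{"hba1c_done": True, "test_date": "2006-03-01"},
              {"hba1c_done": False, "test_date": None}]
standard   = [{"hba1c_done": True, "test_date": "2006-03-01"},
              {"hba1c_done": True, "test_date": "2006-04-15"}]

rate = agreement_rate(abstractor, standard)
print(f"{rate:.0f}% agreement")            # 50% agreement
print("retrain" if rate < 90 else "ok")    # retrain
```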

Impact of Missing Medical Records on Results: "The Case of the Missing Records" (illustration)

Comparison Between Immunization Results With and Without Missing Records Included (chart: HEDIS combined rate for DTP, OPV, MMR, HiB, and HBV shown with and without missing medical records, against a 90% goal)
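
The chart's point is mechanical: under the hybrid method, records that cannot be retrieved are typically counted as noncompliant, so missing charts pull the reported rate down. A toy calculation with made-up numbers (not the figures from the presentation) shows the effect:

```python
# Toy numbers (not from the presentation) showing how missing
# medical records depress a hybrid-method rate when they are
# counted as noncompliant.
sample = 400        # records selected for review
missing = 40        # charts never retrieved
compliant = 306     # members with all immunizations documented

rate_without_missing = compliant / (sample - missing)  # retrieved records only
rate_with_missing = compliant / sample                 # missing = noncompliant

print(f"excluding missing records: {rate_without_missing:.1%}")       # 85.0%
print(f"counting missing as noncompliant: {rate_with_missing:.1%}")   # 76.5%
```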

F. Activity Six: Data Collection Procedures, Data Collection Cycle, and Data Analysis Other Considerations (cont.): Create an Analysis Plan Determine the statistics to be reported Report rates for indicators along with significance Example: Lead screening rates may be reported at 24 months of age, and then also for children by 35 months of age.

F. Activity Six: Data Collection Procedures, Data Collection Cycle, and Data Analysis Other Considerations (cont.): Examine indicators by categories such as age and gender, and overall Analyze necessary referrals Example: Children with elevated blood lead levels require additional follow-up.
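
Rates by category are simple to script once abstraction results are in tabular form. The sketch below uses made-up lead-screening data; the field names and age bands are hypothetical.

```python
# Sketch: indicator rates by age band and overall, using made-up
# lead-screening data. Field names and bands are hypothetical.
from collections import defaultdict

members = [
    {"age_months": 24, "sex": "F", "screened": True},
    {"age_months": 30, "sex": "M", "screened": False},
    {"age_months": 35, "sex": "F", "screened": True},
    {"age_months": 26, "sex": "M", "screened": True},
]

def rate(group):
    return 100.0 * sum(m["screened"] for m in group) / len(group)

by_band = defaultdict(list)
for m in members:
    band = "by 24 months" if m["age_months"] <= 24 else "25-35 months"
    by_band[band].append(m)

for band, group in sorted(by_band.items()):
    print(f"{band}: {rate(group):.1f}% (n={len(group)})")
print(f"overall: {rate(members):.1f}% (n={len(members)})")
```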

G. Activity Seven: Improvement Strategies CMS Rationale An intervention designed to change behavior at all levels of the care delivery system, including the members. Changing performance according to predefined quality indicators. Appropriate interventions. Likelihood of effecting measurable change.

G. Activity Seven: Improvement Strategies HSAG Evaluation Elements Planned/implemented strategies for improvement are: Related to causes/barriers identified through data analysis and Quality Improvement (QI) processes. System changes that are likely to induce permanent change. Revised if original interventions are not successful. Standardized and monitored if interventions are successful.

G. Activity Seven: Improvement Strategies HSAG Evaluation Elements (cont.) Planned/implemented strategies for improvement: May be at the health plan, provider, or member level. Should be realistic, feasible, and clearly defined. Need a reasonable amount of time to be effective.

G. Activity Seven: Improvement Strategies Examples of EPSDT Improvement Strategies: Sharing member-level results with providers Mailing out reminder postcards to members and providers Developing an intervention tool kit that contains clinical guidelines, tracking forms, wall charts, and other provider office tools

H. Activity Eight: Data Analysis and Interpretation of Study Results CMS Rationale Initiated using statistical analysis techniques. Included an interpretation of the extent to which the study was successful.

H. Activity Eight: Data Analysis and Interpretation of Study Results HSAG Evaluation Elements The data analysis: Is conducted according to the data analysis plan in the study design. Allows for generalization of the results to the study population if a sample was selected. Identifies factors that threaten internal or external validity of findings. Includes an interpretation of findings.

H. Activity Eight: Data Analysis and Interpretation of Study Results HSAG Evaluation Elements (cont.) The data analysis: Is presented in a way that provides accurate, clear, and easily understood information. Identifies initial measurement and remeasurement of study indicators. Identifies statistical differences between initial measurement and remeasurement. Identifies factors that affect the ability to compare initial measurement with remeasurement. Includes the extent to which the study was successful.
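
For identifying statistical differences between baseline and remeasurement, a two-proportion z-test is a common choice when the indicator is a rate; the CMS protocol does not mandate a particular test. A minimal sketch with hypothetical counts:

```python
# Two-proportion z-test comparing a baseline rate to remeasurement.
# A common choice for rate indicators; the CMS protocol does not
# mandate a specific test. Counts are hypothetical.
import math

def two_proportion_z(x1, n1, x2, n2):
    """Return (z, two-sided p-value) for H0: p1 == p2."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    p = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return z, p

# Baseline: 250/500 compliant; remeasurement: 300/500 compliant.
z, p = two_proportion_z(250, 500, 300, 500)
print(f"z = {z:.2f}, p = {p:.4f}")  # z = 3.18, p = 0.0015
```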

I. Activity Nine: Study Results and Summary Improvement CMS Rationale Probability that improvement is true improvement. Included an interpretation of the extent to which any changes in performance are statistically significant.

HSAG Evaluation Elements The remeasurement methodology is the same as the baseline methodology. There is documented improvement in processes or outcomes of care. The improvement appears to be the result of intervention(s). There is statistical evidence that observed improvement is true improvement.

J. Activity Ten: Sustained Improvement CMS Rationale Change results from modifications in the processes of health care delivery. If real change has occurred, the project should be able to achieve sustained improvement.

J. Activity Ten: Sustained Improvement HSAG Evaluation Elements Repeated measurements over comparable time periods demonstrate sustained improvement, or that a decline in improvement is not statistically significant.

PIP Scoring Methodology HSAG Evaluation Tool 13 Critical Elements 53 Evaluation Elements (including the Critical Elements)

PIP Scoring Methodology Overall PIP Score Percentage Score: Calculated by dividing the total Met by the sum of the total Met, Partially Met, and Not Met. Percentage Score of Critical Elements: Calculated by dividing the total critical elements Met by the sum of the critical elements Met, Partially Met, and Not Met. Validation Status: Met, Partially Met, Not Met

PIP Scoring Methodology Met (1) All critical elements were Met, and (2) 80%–100% of all elements were Met across all activities.

PIP Scoring Methodology Partially Met (1) All critical elements were Met, and 60% to 79% of all elements were Met across all activities; or (2) One or more critical element(s) were Partially Met.

PIP Scoring Methodology Not Met (1) All critical elements were Met and <60% of all elements were Met across all activities; or (2) One or more critical element(s) were Not Met.

PIP Scoring Methodology Not Applicable (NA) NA elements (including critical elements) were removed from all scoring. Not Assessed Not Assessed elements (including critical elements) were removed from all scoring.

PIP Scoring Methodology Example 1 Met = 43, Partially Met = 2, Not Met = 0, NA = 8, and all critical elements were Met. The MCO receives an overall Met status, indicating the PIP is valid. The score for the MCO is calculated as 43/45 = 95.6 percent. No further action is required.

PIP Scoring Methodology Example 2 Met = 52, Partially Met = 0, Not Met = 1, NA = 0, and one critical element was Not Met. The MCO receives an overall Not Met status and the PIP is not valid. The MCO will need to revise the PIP and resubmit, or send in appropriate information to resolve the issue with the critical element.
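
Both examples follow mechanically from the scoring rules on the preceding slides. The sketch below encodes those rules as stated (with NA and Not Assessed elements already removed) and reproduces the two examples; it is an illustration, not HSAG's actual scoring tool.

```python
# Sketch of the PIP scoring rules as stated on the preceding
# slides; an illustration, not HSAG's scoring tool. NA and
# Not Assessed elements are assumed already removed.

def pip_status(met, partially_met, not_met,
               critical_partially_met=0, critical_not_met=0):
    scored = met + partially_met + not_met   # NA/Not Assessed removed
    pct = 100.0 * met / scored               # percentage score
    if critical_not_met:                     # any critical element Not Met
        status = "Not Met"
    elif critical_partially_met:             # any critical element Partially Met
        status = "Partially Met"
    elif pct >= 80:
        status = "Met"
    elif pct >= 60:
        status = "Partially Met"
    else:
        status = "Not Met"
    return round(pct, 1), status

# Example 1: Met=43, Partially Met=2, Not Met=0, all critical Met.
print(pip_status(43, 2, 0))                       # (95.6, 'Met')
# Example 2: Met=52, Not Met=1, one critical element Not Met.
print(pip_status(52, 0, 1, critical_not_met=1))   # (98.1, 'Not Met')
```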

PIP Tips 1. Complete the demographic page before submission. 2. Notify HSAG when the PIP documents are uploaded to the secure FTP site and state the number of documents uploaded. 3. Label ALL attachments and reference them in the body of the PIP study. 4. HSAG does not require personal health information to be submitted. Submit only aggregate results. 5. Document, document, and document!! 6. Look for a Q&A section on the website. If you have additional questions, contact HSAG.

Deliverables June 27th: PIP Training August: Complete Statement of Intent and Technical Assistance Survey October: MCOs notified electronically of submission date with instructions November: Submit PIP studies to HSAG

Questions and Answers