WELCOME to the PIP Technical Assistance Training for Florida HMOs/PSNs We will begin shortly. Please place your phone on mute, unless you are speaking. Thank you.

Performance Improvement Projects (PIPs) Technical Assistance for Florida Medicaid HMO/PSNs August 22, 2007 Christi Melendez, RN PIP Review Team Project Leader

Presentation Outline Purpose Review of PIP Activities II, IV, V, VI, VII, VIII, and IX Review of the PIP submission process for the 2007-2008 validation cycle Questions and Answers

PURPOSE To provide technical assistance, with examples, for Activities that received an overall score of Partially Met or Not Met in the 2006-2007 validation cycle. To review the PIP submission process for the 2007-2008 validation cycle.

Activity Two: The Study Question HSAG Evaluation Criteria The study question stated the problem to be studied in simple terms and was answerable. *In general, the question should take the form: Does doing X result in Y?

Activity Two: Study Question Examples: Do targeted interventions increase the rate of annual retinal (dilated) eye exams for members with diabetes mellitus? Can interventions with members and providers increase blood lead testing rates? Do targeted interventions improve coordination of care between XYZ health plan and ABC mental health providers for members with ADHD?

Activity Four: The Study Population HSAG Evaluation Criteria Was accurately and completely defined. Included requirements for the length of a member's enrollment in the Health Plan. Captured all members to whom the study question applies. Included ICD-9 codes and procedure codes (if applicable).

Activity Four: The Study Population Example: 100 percent of the eligible plan members are included in this study. The eligible population is defined as Medicaid members ages 18-75 as of December 31st of the measurement year. Continuous enrollment for the entire measurement year was applied. No more than one gap in enrollment of up to 45 days during the measurement year was allowed. Codes: V72.0; CPT codes 92287, 67028, 67038-67040

Activity Five: Sampling Techniques HSAG Evaluation Criteria The true or estimated frequency of occurrence was provided and considered in the sampling technique. Sample size was specified. Confidence level was specified. Acceptable margin of error was specified.

Activity Five: Sampling Techniques HSAG Evaluation Criteria (cont.) The sampling technique ensured a representative sample of the eligible population. Sampling techniques were in accordance with generally accepted principles of research design and statistical analysis. Valid sampling techniques should be used for all study indicators, and they should be replicable using the reported sampling parameters.
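The sampling parameters listed above (confidence level, acceptable margin of error, estimated frequency of occurrence, and eligible population size) together determine the required sample size. As an illustration only, the sketch below uses the standard sample-size formula for a proportion with a finite-population correction; the numbers are hypothetical, not HSAG-prescribed values.

```python
import math

def required_sample_size(confidence_z, margin_of_error, est_proportion,
                         population_size=None):
    """Sample size needed to estimate a proportion, with an optional
    finite-population correction for small eligible populations."""
    n = (confidence_z ** 2) * est_proportion * (1 - est_proportion) \
        / margin_of_error ** 2
    if population_size is not None:
        # finite-population correction
        n = n / (1 + (n - 1) / population_size)
    return math.ceil(n)

# Hypothetical parameters: 95% confidence (z = 1.96), +/-5% margin of
# error, estimated 20% frequency of occurrence, eligible population
# of 2,000 members.
print(required_sample_size(1.96, 0.05, 0.20, 2000))  # → 220
```

Reporting these parameters alongside the sample size is what makes the technique replicable by a reviewer.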

Study Implementation Phase

Activity Five: Sampling Technique EXAMPLE

Activity Six: Data Collection HSAG Evaluation Criteria Data elements collected were clearly identified. The data sources were clearly identified. A systematic method for data collection was outlined in the PIP documentation. The timeline included both starting and ending dates for all measurement periods.

Activity Six: Data Collection For Manual Data Collection: The relevant education, experience, and training of all manual data collection staff were described in the PIP text. The manual data collection tool was included with the PIP submission. A discussion of the interrater reliability process was in the PIP text.

Activity Six: Data Collection HSAG Evaluation Criteria (cont.) Written instructions for the manual data collection tool were clearly and succinctly written and included in the PIP documentation. A brief statement about the purpose of the study (overview) was included in the written instructions for the manual data collection tool.

Activity Six: Data Collection For Administrative Data Collection: Documentation included the systematic process (steps) used to collect data; this can be described in narrative format or with algorithms/flow charts. The estimated degree of administrative data completeness was included, along with an explanation of how the percentage of completeness was calculated.
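The completeness percentage is clearest when the calculation itself is shown. A minimal sketch with hypothetical counts (the slides do not prescribe a specific formula or reference source):

```python
# Hypothetical example: estimate administrative data completeness by
# comparing events found in administrative (claims/encounter) data
# against events confirmed in a reference source, such as a sample
# of medical records.
records_expected = 1250   # events confirmed in the reference source
records_in_claims = 1100  # of those, also found in administrative data

completeness_pct = 100 * records_in_claims / records_expected
print(f"Estimated administrative data completeness: {completeness_pct:.1f}%")
```

Documenting both counts, not just the percentage, lets the reviewer verify how completeness was calculated.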

Activity Six: Data Collection

Activity Six: Data Collection

Activity Seven: Improvement Strategies HSAG Evaluation Criteria A completed causal/barrier analysis, with an explanation of how the intervention(s) relate to the causes/barriers identified through data analysis and quality improvement processes, was included in the PIP documentation. System interventions that will have a permanent effect on the outcomes of the PIP were documented in the text.

Activity Seven: Improvement Strategies HSAG Evaluation Criteria (cont.) If repeat measures did not yield statistically significant improvements, there should be an explanation of how problem solving and data analysis were performed to identify possible causes. If quality improvement interventions were successful, it should be documented that the interventions were standardized and monitored.

How to Perform a Causal/Barrier Analysis Determine why an event or condition occurs. What's the problem? Define the problem and why it's a concern. Determine the significance of the problem: look at data to see how the problem impacts your members and/or health plan. Identify the causes/barriers: conduct analysis of chart review data, surveys, and focus groups; brainstorm at quality improvement committee meetings; review the literature. Develop and implement interventions based on the barriers identified.

Causal/Barrier Analysis Methods and Tools Quality improvement committee Develop an internal task force Tools: Fishbone Process mapping Barrier/intervention table

Activity Seven: Improvement Strategies Example: Fishbone Diagram

Barrier/Intervention Table EXAMPLE Interventions Taken for Improvement as a Result of Analysis. List chronologically the interventions that have had the most impact on improving the measure. Describe only the interventions, and provide quantitative details whenever possible (e.g., "hired 4 customer service reps" as opposed to "hired customer service reps"). Do not include intervention planning activities.

Date Implemented: September 2004 (ongoing)
Intervention: Member education (newsletter/article) regarding the importance of getting an annual retinal eye exam.
Barrier addressed: Members are not having an annual retinal eye exam.

Date Implemented: October 2004
Intervention: Provider education (on-site training) regarding the importance of members having an annual retinal eye exam.

Activity Eight: Data Analysis and Interpretation of Study Results HSAG Evaluation Criteria The data analysis: Was conducted according to the data analysis plan in the study design. Allowed for generalization of the results to the study population if a sample was selected. Identified factors that threaten internal or external validity of findings (change in demographic population, acquiring another health plan’s members, change in the IS system, change in health plan staff).

Activity Eight: Data Analysis and Interpretation of Study Results HSAG Evaluation Criteria (cont.) Included an interpretation of findings. Was presented in a way that provides accurate, clear, and easily understood information. Identified initial measurement and remeasurement of study indicators. Identified statistical differences between initial measurement and remeasurement.

Activity Eight: Data Analysis and Interpretation of Study Results HSAG Evaluation Criteria (cont.) Identified factors that affect the ability to compare initial measurement with remeasurement (changes to the methodology, change in time periods, seasonality, or a change in vendors). Included the extent to which the study was successful.

Activity Eight: Data Analysis and Interpretation of Study Results Example: Baseline Interpretation For the baseline time period of 7/1/03-6/30/04, 14.1 percent of members received a retinal eye exam.

Activity Eight: Data Analysis and Interpretation of Study Results Example: Remeasurement 1 The rate of members receiving a retinal eye exam increased from 14.1 percent at baseline to 21.7 percent in the first remeasurement (7/1/04-6/30/05). This represents a statistically significant increase (p = 0.00167).

Activity Eight: Data Analysis and Interpretation of Study Results Example: Remeasurement 2 The rate increased from 21.7 percent in the first remeasurement to 27.6 percent in the second remeasurement (7/1/05-6/30/06). This increase was not statistically significant (p = 0.0699).
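Statistical differences between measurement periods like these are commonly tested with a two-proportion z-test. The sketch below uses hypothetical denominators of 500 members per period (the slides report only rates and p-values), so it illustrates the method rather than reproducing the example's figures.

```python
import math

def two_proportion_z_test(x1, n1, x2, n2):
    """Two-sided two-proportion z-test using a pooled proportion.
    x1/n1 and x2/n2 are the numerator/denominator for each period."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    # two-sided p-value from the standard normal distribution
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Hypothetical counts: 70/500 (14.0%) at baseline vs. 108/500 (21.6%)
# at remeasurement 1, roughly mirroring the example's rates.
z, p = two_proportion_z_test(70, 500, 108, 500)
print(f"z = {z:.2f}, p = {p:.4f}")
```

Note that significance depends on the denominators as well as the rates, which is why the full table of numerators and denominators belongs in the PIP documentation.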

Activity Eight: Data Analysis and Interpretation of Study Results Overall Analysis: The rate of retinal exams increased each year, with a statistically significant increase in the first remeasurement and no decline in performance. The study has been successful in increasing the rate of retinal exams and will be continued until the goal of 75 percent is met.

Activity Nine: Assessing For Real Improvement HSAG Evaluation Criteria The use of the same methodology for baseline and remeasurements was documented. If there was a change in methodology, it was discussed in the PIP text and the needed changes were justified. Documentation included how the intervention(s) were successful in affecting system-wide processes or health care outcomes.

Activity Nine: Assessing For Real Improvement HSAG Evaluation Criteria (cont.) The improvement in performance as a result of the intervention(s) was documented in the text of the PIP. The PIP documentation included calculations and reported the degree to which the intervention(s) were statistically significant. The table in Activity IX was completely filled out for each measurement period. The actual p values were documented, along with whether or not each value was statistically significant.

Activity Nine: Assessing For Real Improvement Example: Completed Table

New PIP submissions New PIPs were not submitted for the 2006-2007 validation cycle. For new PIP submissions, it is important to contact HSAG to obtain the most current updated PIP Summary Form.

How to Submit Continuing PIPs On-going PIPs (submitted to HSAG for the 2006-2007 validation cycle). Highlight, bold, or add text in a different color, and date any new information that is added to the existing PIP Summary Form. Strikethrough and date any information that no longer applies to the PIP study. Ensure all Partially Met and Not Met evaluation elements from the previous validation cycle have been addressed in the documentation.

Resources Frequently asked questions (FAQs) and PIP information - myfloridaeqro.com NCQA Quality Profiles - http://www.qualityprofiles.org/index.asp Institute for Healthcare Improvement - www.ihi.org Center for Healthcare Strategies - www.chcs.org Health Care Quality Improvement Studies in Managed Care Settings: A Guide for State Medicaid Agencies - www.ncqa.org/publications National Guideline Clearinghouse - www.guidelines.gov Agency for Healthcare Research and Quality - www.ahrq.gov

Deliverables September 7th: HMO/PSNs notified electronically of submission date with instructions October 5th: Submit PIP studies to HSAG * HSAG will be validating two PIPs per HMO/PSN; one clinical and one nonclinical. If the collaborative PIP is clinical, the other PIP chosen for validation will be nonclinical.

PIP Tips 1. Complete the demographic page before submission. 2. Notify HSAG when the PIP documents are uploaded to the secure FTP site, and state the number of documents uploaded. 3. Label ALL attachments and reference them in the body of the PIP study. 4. HSAG does not require personal health information to be submitted. Submit only aggregate results. 5. Document, document, and document!! Go to myfloridaeqro.com for FAQs, or contact Cheryl Neel at cneel@hsag.com with any questions.

HSAG Contacts For questions contact: Cheryl Neel cneel@hsag.com 602.745.6201 Denise Driscoll ddriscoll@hsag.com 602.745.6260

Questions and Answers