
1 Performance Improvement Projects Technical Assistance
Prepaid Mental Health Plans
Thursday, March 29, 2007, 10:30 a.m. – 12:15 p.m.
Cheryl L. Neel, RN, MPH, CPHQ, Manager, Performance Improvement Projects
David Mabb, MS, Sr. Director, Statistical Evaluation

2 Presentation Outline
PIP Overall Comments
Aggregate MCO PIP Findings
Aggregate PMHP Specific Findings
Technical Assistance with Group Activities
–Study Design
–Study Implementation
–Quality Outcomes Achieved
Questions and Answers

3 Key PIP Strategies
1. Conduct outcome-oriented projects
2. Achieve demonstrable improvement
3. Sustain improvement
4. Correct systemic problems

4 Validity and Reliability of PIP Results
Activity 3 of the CMS Validating Protocol: Evaluating overall validity and reliability of PIP results:
–Met = Confidence/High confidence in reported PIP results
–Partially Met = Low confidence in reported PIP results
–Not Met = Reported PIP results not credible

5 Summary of PIP Validation Scores

6 Proportion of PIPs Meeting the Requirements for Each Activity

7 Aggregate Valid Percent Met (chart: percent of evaluation elements Met for Activities I–X)

8 PMHP Specific Findings
8 PIPs submitted
Scores ranged from 37% to 89%
Average score was 77%
Assessed evaluation elements were scored as Met 78% of the time

9 Summary of PMHP Validation Scores

10 Study Design
Four Components:
1. Activity I. Selecting an Appropriate Study Topic
2. Activity II. Presenting Clearly Defined, Answerable Study Question(s)
3. Activity III. Documenting Clearly Defined Study Indicator(s)
4. Activity IV. Stating a Correctly Identified Study Population

11 Activity I. Selecting an Appropriate Study Topic - PMHP Overall Score

12 Activity I. Selecting an Appropriate Study Topic
Results:
92 percent of the six evaluation elements were Met
8 percent were Partially Met or Not Met
None of the evaluation elements were Not Applicable or Not Assessed

13 Activity I: Review the Selected Study Topic
HSAG Evaluation Elements:
Reflects high-volume or high-risk conditions (or was selected by the State).
Is selected following collection and analysis of data (or was selected by the State).
Addresses a broad spectrum of care and services (or was selected by the State).
Includes all eligible populations that meet the study criteria.
Does not exclude members with special health care needs.
Has the potential to affect member health, functional status, or satisfaction.
Bolded evaluation elements show areas for improvement

14 Activity II. Presenting Clearly Defined, Answerable Study Question(s) - PMHP Overall Score
No PIP studies received a Met for both evaluation elements

15 Activity II. Presenting Clearly Defined, Answerable Study Question(s)
Results:
0 percent of the two evaluation elements were Met
100 percent were Partially Met or Not Met
None of the evaluation elements were Not Applicable or Not Assessed

16 Activity II: Review the Study Question(s)
HSAG Evaluation Elements:
States the problem to be studied in simple terms.
Is answerable.
Bolded evaluation elements show areas for improvement

17 Activity III. Documenting Clearly Defined Study Indicator(s) - PMHP Overall Score

18 Activity III. Documenting Clearly Defined Study Indicator(s)
Results:
59 percent of the seven evaluation elements were Met
21 percent were Partially Met or Not Met
20 percent of the evaluation elements were Not Applicable or Not Assessed

19 Activity III: Review Selected Study Indicator(s)
HSAG Evaluation Elements:
Is well defined, objective, and measurable.
Is based on practice guidelines, with sources identified.
Allows for the study question to be answered.
Measures changes (outcomes) in health or functional status, member satisfaction, or valid process alternatives.
Has available data that can be collected on each indicator.
Is a nationally recognized measure, such as HEDIS®, when appropriate.
Includes the basis on which each indicator was adopted, if internally developed.
Bolded evaluation elements show areas for improvement

20 Activity IV. Stating a Correctly Identified Study Population - PMHP Overall Score

21 Activity IV. Stating a Correctly Identified Study Population
Results:
58 percent of the three evaluation elements were Met
13 percent were Partially Met or Not Met
29 percent of the evaluation elements were Not Applicable or Not Assessed

22 Activity IV: Review the Identified Study Population
HSAG Evaluation Elements:
Is accurately and completely defined.
Includes requirements for the length of a member’s enrollment in the managed care plan.
Captures all members to whom the study question applies.
Bolded evaluation elements show areas for improvement

23 Group Activity

24 Study Implementation
Three Components:
1. Activity V. Valid Sampling Techniques
2. Activity VI. Accurate/Complete Data Collection
3. Activity VII. Appropriate Improvement Strategies

25 Activity V. Presenting a Valid Sampling Technique - PMHP Overall Score

26 Activity V. Presenting a Valid Sampling Technique
Results:
3 out of the 8 PIP studies used sampling.
38 percent of the six evaluation elements were Met.
0 percent were Partially Met or Not Met.
63 percent of the evaluation elements were Not Applicable or Not Assessed.

27 Activity V: Review Sampling Methods
* This section is only validated if sampling is used.
HSAG Evaluation Elements:
Consider and specify the true or estimated frequency of occurrence. (N=8)
Identify the sample size. (N=8)
Specify the confidence level to be used. (N=8)
Specify the acceptable margin of error. (N=8)
Ensure a representative sample of the eligible population. (N=8)
Ensure that the sampling techniques are in accordance with generally accepted principles of research design and statistical analysis. (N=8)
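To make the frequency-of-occurrence, confidence-level, and margin-of-error elements above concrete, the following is a minimal sketch of a commonly used sample-size calculation for a proportion, with a finite population correction. It is illustrative only, not HSAG's prescribed method; the function name and the inputs (50 percent estimated occurrence, 95 percent confidence, a ±5 percent margin of error, and 2,500 eligible members) are hypothetical.

```python
# Minimal sketch (not HSAG's prescribed method): minimum sample size for a
# proportion, given the estimated frequency of occurrence (p), the confidence
# level (via its z value), and the acceptable margin of error, with a finite
# population correction for the size of the eligible population.
import math

def minimum_sample_size(p: float, margin_of_error: float, z: float, population: int) -> int:
    n0 = (z ** 2) * p * (1 - p) / (margin_of_error ** 2)  # infinite-population estimate
    n = n0 / (1 + (n0 - 1) / population)                  # finite population correction
    return math.ceil(n)

# Hypothetical inputs: 50% estimated occurrence (the most conservative choice),
# 95% confidence (z = 1.96), +/-5% margin of error, 2,500 eligible members.
print(minimum_sample_size(p=0.5, margin_of_error=0.05, z=1.96, population=2500))  # -> 334
```

Using p = 0.5 is the conservative default when the true frequency of occurrence is unknown; a documented estimate from baseline data can justify a smaller sample.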

28 Populations or Samples?
Generally,
–The administrative method uses the full eligible population
–The hybrid (chart abstraction) method uses samples identified through administrative data

29 Activity VI. Specifying Accurate/Complete Data Collection - PMHP Overall Score

30 Activity VI. Specifying Accurate/Complete Data Collection
Results:
55 percent of the eleven evaluation elements were Met
10 percent were Partially Met or Not Met
35 percent of the evaluation elements were Not Applicable or Not Assessed

31 Activity VI: Review Data Collection Procedures
HSAG Evaluation Elements:
Clearly defined data elements to be collected. (55 percent Met)
Clearly identified sources of data. (77 percent Met)
A clearly defined and systematic process for collecting data that includes how baseline and remeasurement data will be collected. (62 percent Met)
A timeline for the collection of baseline and remeasurement data. (57 percent Met)
Qualified staff and personnel to collect manual data. (13 percent Met; 77 percent N/A)
A manual data collection tool that ensures consistent and accurate collection of data according to indicator specifications. (13 percent Met; 77 percent N/A)
Bolded evaluation elements show areas for improvement

32 Activity VI: Review Data Collection Procedures (cont.)
HSAG Evaluation Elements:
A manual data collection tool that supports interrater reliability. (13 percent Met; 77 percent N/A)
Clear and concise written instructions for completing the manual data collection tool. (13 percent Met; 77 percent N/A)
An overview of the study in the written instructions.
Administrative data collection algorithms that show steps in the production of indicators.
An estimated degree of automated data completeness (important if using the administrative method).
Bolded evaluation elements show areas for improvement

33 Where do we look for our sources of data?

34 Baseline Data Sources
Medical records
Administrative claims/encounter data
Hybrid
HEDIS
Survey data
MCO program data
Other

35 Activity VII. Documenting the Appropriate Improvement Strategies - PMHP Overall Score

36 Activity VII. Documenting the Appropriate Improvement Strategies
Results:
44 percent of the four evaluation elements were Met
3 percent were Partially Met or Not Met
53 percent of the evaluation elements were Not Applicable or Not Assessed

37 Activity VII: Assess Improvement Strategies
HSAG Evaluation Elements:
Related to causes/barriers identified through data analysis and Quality Improvement (QI) processes.
System changes that are likely to induce permanent change.
Revised if original interventions are not successful.
Standardized and monitored if interventions are successful.
Bolded evaluation elements show areas for improvement

38 Determining Interventions
Once you know how you are doing at baseline, what interventions will produce meaningful improvement in the target population?

39 First Do a Barrier Analysis
What did an analysis of baseline results show?
How can we relate it to system improvement?
Identify barriers to reaching improvement
Identify opportunities for improvement
Determine intervention(s)

40 How were the intervention(s) chosen?
By reviewing the literature
–Evidence base
–Pros & cons
–Benefits & costs
Develop a list of potential interventions – which is most effective?

41 Types of Interventions
Education
Provider performance feedback
Reminders & tracking systems
Organizational changes
Community-level interventions
Mass media

42 Choosing Interventions
Balance
–potential for success with ease of use
–acceptability to providers & collaborators
–cost considerations (direct and indirect)
Feasibility
–adequate resources
–adequate staff and training to ensure a sustainable effort

43 Physician Interventions: Multifaceted Approaches Are Most Effective
Most effective:
–real-time reminders
–outreach/detailing
–opinion leaders
–provider profiles
Less effective:
–educational materials (alone)
–formal CME programs without enabling or reinforcing strategies

44 Patient Interventions
Educational programs
–Disease-specific education booklets
–Lists of questions to ask your physician
–Organizing materials: flowsheets, charts, reminder cards
–Screening instruments to detect complications
–Direct mailing, media ads, websites

45 Evaluating Interventions
Does it target a specific quality indicator?
Is it aimed at appropriate stakeholders?
Is it directed at a specific process/outcome of care or service?
Did the intervention begin after the baseline measurement period?

46 Interventions Checklist
✓ Analyze barriers (root causes)
✓ Choose & understand target audience
✓ Select interventions based on cost-benefit
✓ Track intermediate results
✓ Evaluate effectiveness
✓ Modify interventions as needed
✓ Re-measure

47 Group Activity

48 Quality Outcomes Achieved
Three Components:
1. Activity VIII. Presentation of Sufficient Data Analysis and Interpretation
2. Activity IX. Evidence of Real Improvement Achieved
3. Activity X. Data Supporting Sustained Improvement Achieved

49 Activity VIII. Presentation of Sufficient Data Analysis and Interpretation - PMHP Overall Score

50 Activity VIII. Presentation of Sufficient Data Analysis and Interpretation
Results:
14 percent of the nine evaluation elements were Met
8 percent of the evaluation elements were Partially Met or Not Met
78 percent of the evaluation elements were Not Applicable or Not Assessed

51 Activity VIII: Review Data Analysis and Interpretation of Study Results
HSAG Evaluation Elements:
Is conducted according to the data analysis plan in the study design.
Allows for generalization of the results to the study population if a sample was selected.
Identifies factors that threaten internal or external validity of findings.
Includes an interpretation of findings.
Is presented in a way that provides accurate, clear, and easily understood information.
Bolded evaluation elements show areas for improvement

52 Activity VIII: Review Data Analysis and Interpretation of Study Results (cont.)
HSAG Evaluation Elements:
Identifies initial measurement and remeasurement of study indicators.
Identifies statistical differences between initial measurement and remeasurement.
Identifies factors that affect the ability to compare initial measurement with remeasurement.
Includes the extent to which the study was successful.
Bolded evaluation elements show areas for improvement

53 Changes in Study Design?
The study design should be the same as at baseline:
Data source
Data collection methods
Data analysis
Target population or sample size
Sampling methodology
If any of these change, the rationale must be specified and appropriate

54 Activity IX. Evidence of Real Improvement - PMHP Overall Score

55 Activity IX. Evidence of Real Improvement
Results:
16 percent of the four evaluation elements were Met
9 percent were Partially Met or Not Met
75 percent of the evaluation elements were Not Applicable or Not Assessed

56 Activity IX: Assess the Likelihood that Reported Improvement is “Real” Improvement
HSAG Evaluation Elements:
The remeasurement methodology is the same as the baseline methodology.
There is documented improvement in processes or outcomes of care.
The improvement appears to be the result of intervention(s).
There is statistical evidence that observed improvement is true improvement.
Bolded evaluation elements show areas for improvement

57 Statistical Significance Testing
Time Period: CY 2003, Measurement: Baseline, Numerator: 201, Denominator: 411, Rate: 48.9%, Industry Benchmark: 60%, Statistical Testing: N/A
Time Period: CY 2004, Measurement: Re-measurement 1, Numerator: 225, Denominator: 411, Rate: 54.7%, Industry Benchmark: 60%, Statistical Testing: Chi-square = 2.8, p-value = 0.09387
Not significant at the 95% confidence level
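The chi-square result above can be reproduced directly from the counts on the slide. The sketch below is illustrative rather than the presenters' actual code; it assumes a chi-square test without continuity correction, which matches the reported statistic of about 2.8 and p-value of about 0.094.

```python
# Illustrative sketch: reproduce the slide's chi-square test from its counts
# (201 of 411 at baseline vs. 225 of 411 at remeasurement 1). correction=False
# gives the uncorrected statistic, which matches the reported value.
from scipy.stats import chi2_contingency

baseline_hits, baseline_total = 201, 411
remeasure_hits, remeasure_total = 225, 411

table = [
    [baseline_hits, baseline_total - baseline_hits],     # CY 2003: met / not met
    [remeasure_hits, remeasure_total - remeasure_hits],  # CY 2004: met / not met
]

chi2, p_value, dof, expected = chi2_contingency(table, correction=False)
print(f"rates: {baseline_hits / baseline_total:.1%} -> {remeasure_hits / remeasure_total:.1%}")
print(f"chi-square = {chi2:.2f}, p = {p_value:.4f}")  # ~2.81, ~0.0939
```

Because the p-value exceeds 0.05, the increase from 48.9 percent to 54.7 percent is not statistically significant at the 95 percent confidence level, which is the conclusion shown on the slide.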

58 Activity X. Data Supporting Sustained Improvement Achieved - PMHP Overall Score
No PMHP reached this Activity

59 Activity X. Data Supporting Sustained Improvement Achieved
Results:
0 percent of the one evaluation element was Met
0 percent was Partially Met or Not Met
100 percent of the evaluation element was Not Applicable or Not Assessed

60 Activity X: Assess Whether Improvement is Sustained
HSAG Evaluation Elements:
Repeated measurements over comparable time periods demonstrate sustained improvement, or that a decline in improvement is not statistically significant.
Bolded evaluation elements show areas for improvement

61 Quality Outcomes Achieved
Timeline: Baseline, then first-year demonstrable improvement, then sustained improvement

62
Modifications in interventions
Changes in study design
Improvement sustained for 1 year

63 HSAG Contact Information
Cheryl Neel, RN, MPH, CPHQ, Manager, Performance Improvement Projects
cneel@hsag.com, 602.745.6201
Denise Driscoll, Administrative Assistant
ddriscoll@hsag.com, 602.745.6260

64 Questions and Answers

