
1 Funded by HRSA HIV/AIDS Bureau Welcome to Day 2 Module 12 – Day 2 8:00am – 9:00am (60 min)

2 National Quality Center (NQC) TOT Learning Objectives
Understand quality improvement principles and their application to HIV care
Design appropriate and helpful adult learning experiences that measurably advance quality improvement
Improve confidence in ability to facilitate quality improvement-related training opportunities

3 TOT Learning Objectives (cont.)
Effectively plan the meeting logistics for upcoming workshops
Understand how to routinely report to NQC on training activities, training participants and evaluation results
Access resources dedicated to successful TOT participants

4 Agenda – Day 2
8:00 Welcome & Warm-up Activity
9:00 Using Assessment; Five Step Model for Creating Effective Training
11:00 Performance Measurement
12:30 Box Lunch and Video
1:30 QI Resources Safari
2:15 Individual Training Design and Practice
4:30 Experiential Learning: Paper Puppets
5:15 'Aha Moments' & Brief TOT Evaluation
6:00 Adjourn

5 Fears and Challenges Discussion

6 Activity: Using Flowcharts Effectively

7 Flowchart Sample

8 Flowchart Symbols

9 Flowchart: The Best Pizza in the World
Decide on start and end point
Brainstorm all the steps in the process
Write a single step on a single post-it note
Arrange the post-it notes in order

10 Using Assessment to Target Training at Individual Learner Needs Module 9 – Day 1 4:30pm – 5:00pm (30 min)

11 Learning Objectives
Participants will be able to identify and use different data-based methods for understanding adult learner needs

12 Learner-Centered Training: Key Principles
Experience: "Adult learners come to each learning event with their unique former knowledge. Adult learners possess a great deal more experience than children do. Some of it facilitates learning, but it may also act as an inhibitor. Adult learners learn if the training is pitched at their level and type of experience." From: Telling Ain't Training

13 Sample of Survey Monkey Data
Sample Assessment Statements:
I understand the value of learning through small, incremental changes to achieve improvement.
I can help others implement quality improvement activities.
I can use data to evaluate the performance of processes or systems.
I can develop effective learning materials for training.
I can use instructional media effectively in training sessions (e.g. PowerPoint).

14 Types of Learning Needs Assessments
Personal discussion or email exchange with participant
Work samples from site visit
Feedback from participant's manager, colleagues or patients
Focus groups of prospective participants
Pre-training written or online self-assessment

15 Who Is In the Room?

16

17 Assessment and the Five Step Model for Creating Effective Training Module 13 – Day 2 9:00am – 10:30am (90 min)

18 Adult Education: Development, Design, Delivery

19 Assessment: Who is out there? And what do they need?

20 Session Plan: A Universal Five Step Model that Gets Great Results
Using this simple five step model to design your training program will help your learners learn and grow.
Rationale: Why?
Objectives: What?
Activities: How?
Evaluation: How Well?
Corrective Feedback / Confirming Feedback: OK?
Source: Telling Ain't Training, ASTD, Harold Stolovitch, Erica Keeps, 2006

21 Training Development Model: Read and Discuss
STEP 1: Topic & Rationale (page 110)
STEP 2: Performance Objectives (page 110)
STEP 3: Activities (page 110)
STEP 4: Evaluation (page 111)
STEP 5: Feedback (page 110)

22 Own Case Study Activity
Individual (30 minutes): hand out Five Step Model sheets; Option A; Option B
Pair Share (20 minutes): share completed handouts; get feedback from partner (Makes Sense / Could Improve)

23 Training Development and Design Guide: Session Plan
Session Title: / Target Audience: / Time Allotted: / Rationale: / Objectives: / Activities: / Evaluation: / Feedback:
Source: Telling Ain't Training, ASTD, Harold Stolovitch, Erica Keeps, 2006

24 Group Debrief
Did the template help you develop your training, and how?
What type of feedback did you give? Receive?
What further thoughts do you have on your first "TOT Workshop" and your potential participants?
What questions or concerns do you have with this initial structure for developing training?

25 BREAK; QI Safari Presentations; talk with Faculty

26 Performance Measurement: Overview & Group Exercises Module 14 – Day 2 11:00am – 12:30pm (90 min)

27 Learning Objectives
Discuss the common language of improvement
Apply HAB performance measures to identify areas for improvement in your clinic/program
Find available resources around performance measurement and benchmarking in HIV care
Share with others lessons learned around performance measurement

28 Overall Goal
The best care we know how to give, for every patient, at every site, every day.

29 Improvement Language: Definitions, Paradigms and Variation

30

31 Data for Improvement, Accountability and Research in Health Care
Adapted by HIVQUAL International from: Solberg, "The Three Faces of Performance Measurement: Improvement, Accountability and Research." Journal on Quality Improvement, V. 23, 1997.

32 Data for Improvement, Accountability and Research in Health Care: Added by HIVQUAL International

33 Variation exists… © Virginia L H Crowe

34 Variation: The Voice of the System

35 We listen with our eyes… to the voice of the system

36 Display
"Tracking a few key measures over time is the single most powerful tool a team can use." Attributed to T. Nolan, PhD
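Tracking a measure over time does not require charting software. As an illustration only (not part of the original deck), a few lines of Python can print a text-based run chart; the monthly rates below are invented example data:

```python
# Text-based run chart: one line per period, with a bar proportional to the rate.
# The monthly rates here are invented for illustration only.
monthly_rates = {
    "Jan": 62, "Feb": 65, "Mar": 63, "Apr": 70, "May": 74, "Jun": 78,
}

def run_chart(rates, width=40):
    """Return one line per period: label, rate, and a proportional '#' bar."""
    lines = []
    for period, rate in rates.items():
        bar = "#" * round(rate / 100 * width)
        lines.append(f"{period} {rate:3d}% {bar}")
    return lines

for line in run_chart(monthly_rates):
    print(line)
```

Even this crude display makes a trend (or its absence) visible at a glance, which is the point of the slide's quote.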

37 Cycle Time for Practices 1, 2 & 3: "Getting in to see the doctor" (Practice 1, Practice 2, Practice 3)

38 HAB/OPR Performance Measures
Yin and Yang: Performance Measurement and Quality Improvement

39 Balance between performance measurement and quality improvement activities

40 What we want to avoid… Quality Management Program

41 Development of HAB Clinical Performance Measures
Underlying assumption: the development and use of performance measures is key to a solid QM program
Internal HAB working group identified potential indicators
Recommendations from IOM report "Measuring What Matters" used as a framework for developing measures
Significant input from key stakeholders was obtained

42 What is a quality indicator?
A quality indicator is a tool to measure specific aspects of care and services that are optimally linked to better health outcomes while being consistent with current professional knowledge and meeting client needs.

43 What makes a good indicator?
Relevance: Does the indicator affect a lot of people or programs? Does the indicator have a great impact on the programs or patients/clients in your EMA, State, network or clinic?
Measurability: Can the indicator realistically and efficiently be measured given finite resources?

44 What makes a good indicator? (cont'd.)
Accuracy/Validity: Is the indicator based on accepted guidelines or developed through formal group decision-making methods?
Improvability: Can the performance rate associated with the indicator realistically be improved given the limitations of your services and population?

45 HAB Clinical Performance Measures
Measures focus on clinical services provided to adults & adolescents
A total of 28 measures were included in the original set
Detail sheets outlined specific information related to each measure
Calendar year is the measurement period
http://hab.hrsa.gov/special/habmeasures.htm

46 HAB HIV/AIDS Core Clinical Performance Measures
Group 1: can serve as a foundation on which to build, especially if a clinical program has no performance measures.
Group 2: are important measures for a robust clinical management program and should be seriously considered.
Group 3: considered "best practice," but lack written clinical guidelines or are difficult to collect.
Google: HAB measures

47 HAB Clinical Performance Measures: Group 1
Represents the 5 core clinical performance measures deemed critical for HIV programs to monitor.
Indicators*: ARV Therapy for Pregnant Women; CD4 T-cell Count; HAART; Medical Visits; PCP Prophylaxis
*There is no hierarchy of importance within the Group. Released 12/07

48 Group 1 Clinical Indicators
ARV Therapy for Pregnant Women: % of pregnant women with HIV infection who are prescribed antiretroviral therapy
CD4 T-Cell Count: % of clients who had 2 or more CD4 counts at least 3 months apart
HAART: % of clients with AIDS who are prescribed HAART
Medical Visits: % of clients who had two or more medical visits in an HIV care setting
PCP Prophylaxis: % of clients with CD4 count <200 cells/mm3 who were prescribed PCP prophylaxis
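Each indicator above reduces to a numerator (clients meeting the measure) over a denominator (clients eligible for it). As an illustration only, a minimal Python sketch of that computation; the client records and field names (`has_aids`, `on_haart`) are hypothetical, not the HAB data specification:

```python
# Rate = clients meeting the measure / clients eligible for the measure.
# The client records below are invented for illustration.
clients = [
    {"id": 1, "has_aids": True,  "on_haart": True},
    {"id": 2, "has_aids": True,  "on_haart": False},
    {"id": 3, "has_aids": False, "on_haart": False},
    {"id": 4, "has_aids": True,  "on_haart": True},
]

def indicator_rate(records, eligible, met):
    """Percentage of eligible records that meet the measure."""
    denominator = [r for r in records if eligible(r)]
    if not denominator:
        return None  # measure not applicable to this caseload
    numerator = [r for r in denominator if met(r)]
    return 100 * len(numerator) / len(denominator)

# HAART measure: % of clients with AIDS who are prescribed HAART.
rate = indicator_rate(clients,
                      eligible=lambda r: r["has_aids"],
                      met=lambda r: r["on_haart"])
print(f"HAART: {rate:.1f}%")  # 2 of 3 eligible clients
```

The eligibility and met-the-measure rules are exactly what each HAB detail sheet spells out; the arithmetic never changes.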

49 HAB Clinical Performance Measures: Group 2
Group 2 measures reflect important aspects of care that impact HIV-related morbidity & focus on treatment decisions that affect a sizable population:
Adherence Assessment & Counseling, Cervical Cancer Screening, Hepatitis B Vaccination, Hepatitis C Screening, HIV Risk Counseling, Lipid Screening, Oral Exam, Syphilis Screening, TB Screening
*There is no hierarchy of importance within the Group. Released 8/08

50 HAB Clinical Performance Measures: Group 3
Group 3 measures represent areas of care that are considered "best practice," but may lack written clinical guidelines or rely on data that are difficult to collect:
Chlamydia Screening, Gonorrhea Screening, Hepatitis B Screening, Hepatitis/HIV Alcohol Counseling, Influenza Vaccination, MAC Prophylaxis, Mental Health Screen, Pneumococcal Vaccination, Substance Abuse Screen, Tobacco Cessation Counseling, Toxoplasma Screening
*There is no hierarchy of importance within the Group. Released in Summer 2009

51 Other HAB Measures
Medical Case Management: care plans and medical visits
Oral Health Performance Measures: dental and medical history, treatment plan, oral health ed, periodontal screening/exam, treatment plan completion
ADAP Performance Measures: eligibility recertification, app determination, formulary, inappropriate ARV

52 HAB Clinical Performance Measures
Grantees are encouraged to include the core clinical performance measures in their quality management program
Grantees are not required to submit performance measurement data to HAB
FAQs developed as a companion guide: http://hab.hrsa.gov/special/habmeasures.htm

53 How are you using the HAB/OPR measures in your quality program?

54 Question: When does it make sense to use indicators in addition to, or different than, the HAB indicators?

55 NQC Publication
'Measuring Clinical Performance' is a guide for HIV providers to learn more about indicator development and data collection
Download @ http://www.nationalqualitycenter.org/index.cfm/35778/index.cfm/22/13908

56 eHIVQUAL Performance Data, Part C and Part D Programs (Sample)
CD4 Count, last 6 months: 88.5%
Viral Load, last 6 months: 86.1%
TB Testing: 67.0%
Pelvic Exam: 70.6%
Gonorrhea Screening: 62.5%
Chlamydia Screening: 62.8%
Syphilis Screening: 79.1%
HAART, CD4<350: 87.8%
HAART, CD4<500: 83.0%
Hepatitis C Screening: 87.2%
Dental Exam: 43.4%
Tobacco Use: 79.8%
Substance Use Screening: 76.9%
PCP Prophylaxis: 84.7%
Viral Load Suppression: 59.7%
Mental Health Screening: 33.1%
Colonoscopy, 50+ yrs. old: 11.9%

57 Got Data: Now What? Using Data for Quality Improvement

58 Got Data: Now What?
Analyze (understand the data)
Prioritize areas for action
Communicate with stakeholders
Take action

59 First, Look at the Data: What Do They Tell Us?
How bad is the problem? How many? How often? How severe?
Is the performance stable, or is there a trend? Getting better? Getting worse?
How does our performance compare to others'?
What are the data limitations?

60 Data are a Guide, Not a Grade
They tell us what questions to ask. At the level of a network, we can NEVER say that the data represent performance; we can only say that the data are leading us to ask questions: "Why is this number so low?" "Is it a data entry problem, a data system/communication problem, or a problem providing the care (or a mix)?" "Why are the numbers so different from this area to that area?" "What are the data telling us about our system?" Only the providers can ANSWER those questions, but we can't even know what to ask if we aren't looking at the whole network's data together.

61 Kübler-Ross Stages of Coping with Data
Denial: "The data are wrong…"
Anger: "The data are right, but it's not a problem…"
Bargaining: "The data are right, it's a problem, but it's not my problem…"
Acceptance: "The data are right, it's a problem, it's my problem…"

62 Data Follow-up
What immediate changes will you make based on the key findings?
Are you considering initiating a QI project to address the data findings?
Who will be responsible and what are the next steps?

63 Options for Action
'Do nothing!' – if results are within expected ranges and goals, repeat measurement frequently
'Take Immediate Individual Action' – follow up on individual pts (missed appointments, pts not on meds, etc.) and/or provider
'Quick PDSA' – develop a quick pilot test
'Launch QI Project!' – set up a cross-functional team to address identified aspects of chronic care

64 Barriers to Putting Data into Action
Don't even know where to get data/info
Paralysis by analysis
No one is interested in it
Defensiveness
Too complex to understand
Incorrect interpretation of data

65 Group Exercise: Prioritizing Areas for Action
In your table group:
Review the data report for the State of Euphoria
Decide which one area/indicator you will focus on for next steps
Be ready to discuss in the large group what you decided and why

66 Data Sharing
Did you discuss the data results and analysis in your quality management committee?
How will you share the results with your staff and consumers?
How do you generate ownership among staff and consumers?

67 To Name or Not to Name…
Pro Naming:
Extra incentive for sites to improve
Consumers prefer to see results by site name
Explanations for results are often more obvious when the site name is known
Allows identification of high-performing sites to share what they know
Anti Naming:
Results reflect a complicated mix of data system issues and clinical performance
Could be misused or misunderstood
Can cause provider anger and disengagement

68 Data Reporting: Benchmarking Concern
When using benchmarking data, understand what you are comparing: apples to apples, i.e. numerators to numerators and denominators to denominators
Example: Programs frequently compare results of HIVQUAL measures with HAB measures

69 Data Reporting: Benchmarking Concern
Example: Cervical Cancer Screen – Pap Test
Using HAB's annual cervical cancer screen (Pap test) measure, one Part C/D program found that 67% of their women received an annual Pap test in 2010. They presented their data to the HIV QM Committee, including a comparison with HIVQUAL's data report of 70% as the mean. How would you advise this program?
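The concern in the two slides above is really about denominators. As an invented illustration (the eligibility rule and records here are hypothetical, not the actual HAB or HIVQUAL measure definitions), the same set of charts yields two different rates when one measure counts all women and another counts only women in care for the full year:

```python
# Same charts, two denominator definitions -> two different "Pap test rates".
# All records and the eligibility rule are invented for illustration.
charts = [
    {"in_care_full_year": True,  "pap_done": True},
    {"in_care_full_year": True,  "pap_done": True},
    {"in_care_full_year": True,  "pap_done": False},
    {"in_care_full_year": True,  "pap_done": True},
    {"in_care_full_year": False, "pap_done": False},
    {"in_care_full_year": False, "pap_done": False},
]

def rate(records):
    """Percentage of records with a documented Pap test."""
    return 100 * sum(r["pap_done"] for r in records) / len(records)

all_women = charts
full_year_only = [r for r in charts if r["in_care_full_year"]]

print(f"All women:      {rate(all_women):.0f}%")       # 3 of 6 charts
print(f"Full-year only: {rate(full_year_only):.0f}%")  # 3 of 4 charts
# Comparing one program's first number against a benchmark computed like the
# second is numerators-to-numerators but NOT denominators-to-denominators.
```

This is why a program comparing its HAB-defined 67% against an HIVQUAL-defined 70% mean should first confirm the two measures count the same women.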

70 Tips for Sharing Data
Involve stakeholders when reports are generated and disseminated
Share reports with staff promptly & listen to variance explanations
View performance improvement as a management tool (as a guide, not a grade…)
Anticipate defensiveness
Watch out for paralysis by analysis ("we need even better data before we act…")

71 Create a Plan
Decide on a sampling plan (sample size, eligible records, draw a random sample)
Develop data collection tools and instructions
Train data abstractors
Run a pilot test (adjust after a few records)
Inform other staff of the measurement process
Check for data accuracy
Remain available for guidance
Make a plan for display and distribution of data

72 Collect "Just Enough" Data
The goal is to improve care, not prove a new theorem
100% is not needed
Maximal power is not needed
In most cases, a straightforward sample will do just fine

73 Establish Accountability for Data Collection
Assign a staff person or a team ('chart lunches') for the data collection process
Establish a timeframe for data collection
Assign a representative to report the data (and potential data collection barriers) at the next QM Committee meeting

74 Data Collection Resource: eHIVQUAL Software
A web-based application, www.ehivqual.org
Incorporates HIV clinical indicators to measure HIV care (adult, adolescent, pediatric and case management)
Provides immediate reporting of performance data for use in internal quality programs
External reporting for comparisons with other participating HIVQUAL programs
If interested, contact your HIVQUAL Consultant or Darryl Ng, Director, HIVQUAL-US, dwn01@health.state.ny.us

75 Resources to Randomize Your Records
"Measuring Clinical Performance: A Guide for HIV Health Care Providers" (includes random number tables)
A useful website for the generation of random numbers is www.randomizer.org
Common spreadsheet programs, such as MS Excel
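Besides random-number tables, websites and spreadsheets, a few lines of a scripting language can draw the sample. A minimal sketch in Python (the chart IDs are hypothetical, and the seed is fixed only so the draw can be reproduced and audited):

```python
import random

# Eligible records: in practice, pull this ID list from your data system.
record_ids = [f"CHART-{n:04d}" for n in range(1, 201)]  # 200 hypothetical charts

def draw_sample(ids, size, seed=None):
    """Simple random sample without replacement, sorted for easier chart pulls."""
    rng = random.Random(seed)
    return sorted(rng.sample(ids, size))

sample = draw_sample(record_ids, size=30, seed=2010)
print(len(sample), sample[:3])
```

Recording the seed alongside the sample lets a reviewer regenerate exactly the same chart list later.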

76 Lunch

77 QI Safari Infomercials Module 16 – Day 2 1:30pm – 2:15pm (45 min)

78 Mini Presentations by Volunteers
"Infomercial": HIVQUAL Workbook; Guide to Consumer Involvement; QM Plan Checklist
"Infomercial": Patient Satisfaction Guide; Patient Health Journal; Performance Measurement Guide

79 Quality Improvement Publications

87 Individual Training Design and Practice Module 17 – Day 2 2:15pm – 4:30pm (135 min)

88 Objectives
Create a high-level training outline, following the Five Step Training Guide
Use a detailed Day-At-A-Glance form to structure a two-day training workshop
Practice using a Faculty Notes Form to structure one or more training modules within a workshop
Improve the effectiveness of your planned training through practice and feedback

89 Putting It All Together
This module is designed in two parts:
A) Design a training module or modules
B) Share your training design with peers in small groups to receive feedback

90 Putting It All Together
Use the forms provided to develop your module(s):
Use the Five Step Training Guide to get clear on the big picture (TOT pg. XXXX)
Get more detailed using the Faculty Notes Form (TOT pg. XXX)
If you will be planning a multi-module training lasting more than 4 hours, you may explore and practice using the Day-At-A-Glance form (TOT pg. xxx)

91 Afternoon Schedule
Work On Own 2:30 – 3:30 (break as you work)
Pair-up 3:30 – 4:15 (Peer Feedback form)
Group Debrief 4:15 – 4:30
Experiential Learning 4:15 – 5:00
Closing Eval 5:00 – 5:30

92 Peer Feedback
A. Form into pairs or small groups w/ faculty guidance
B. Divide and monitor the group's time so that each group member has an equal amount of time in which to share and receive feedback
C. Use the written feedback forms (Five Step Model Feedback Sheet, in TOT Guide pg. 136-137) to capture feedback on their training module(s)

93 Large Group Discussion
What are the challenges in designing your training?
How do the forms support your design work?
What 1-2 things have you learned from this practice session?
What other challenges do you foresee at this stage?
What additional support or practice do you need after these modules?

94 Paper Puppet Exercise (from NQC Game Guide) Module 18 – Day 2 4:30pm – 5:15pm

95 Paper Puppet Game

96 Overview
Teams of 6 participants (5 workers and one timekeeper)
Instructions on how to construct one paper puppet
First round: each team produces as many paper puppets as possible
Short group discussion
Second round: teams produce puppets within a specific time period; facilitator performs quality inspection
Short group discussion

97 How to Construct a Paper Puppet
Step 7: Draw two red eyes and a blue tongue to finish your paper puppet.

98 First Round
Objective: each team has to produce as many paper puppets as possible within 3 minutes
Each of the five team members assumes one role in the production of the paper puppet
Timekeeper tracks time when each paper puppet is produced
Timekeeper also assumes the role of the quality officer (rejects a paper puppet if not satisfactory)
Record time on flipchart paper

99 Group Discussion
What worked well?
Which step seemed to be the bottleneck?
Why did the team perform the way they did?
What factors would have contributed to a higher 'throughput'?

100 Second Round – Competitive Round
Objective: each team has to produce as many paper puppets as possible within 3 minutes
Rules:
Each member has to assume a role in producing each paper puppet
Each paper puppet has to meet the requirements of the quality officer
Timekeeper tracks time when each paper puppet is produced; record time on flipchart paper
The session facilitator will act as the central quality officer and can reject a paper puppet if not satisfactory; unsatisfactory puppets must be redone
Be creative!

101 Debriefing
What are the lessons learned from this game?
One step in a process affects another: improvement comes from advancing each step in the process
Changing one step does not necessarily make a substantial change to the entire system: to improve HIV care, the system of HIV care needs to be changed
Team members contribute suggestions for change: embrace the diversity of opinions
Team support is needed to assist the process: leaders need to support improvements, not resist change
What are the implications for your HIV program?

102 Sharing of Aha! Moments & Brief TOT Evaluation Module 19 – Day 2 5:15pm – 6:00pm (45 min)

103 Evaluation
Personal Highlights or Aha! Moments
Polling/Survey
Went well / Do differently

104 The way the course was delivered today was an effective way for me to learn.
1. Strongly Disagree 2. 3. Agree 4. 5. Strongly Agree

105 I had sufficient opportunity to participate today.
1. Strongly Disagree 2. 3. Agree 4. 5. Strongly Agree

106 Materials were useful during the day.
1. Strongly Disagree 2. 3. Agree 4. 5. Strongly Agree

107 The agenda and content for today were logically organized.
1. Strongly Disagree 2. 3. Agree 4. 5. Strongly Agree

108 My knowledge and/or skills increased as a result of today.
1. Strongly Disagree 2. 3. Agree 4. 5. Strongly Agree

109 The workshop had the right balance of lecture and interactive activities.
1. Strongly Disagree 2. 3. Agree 4. 5. Strongly Agree

110 Overall, I was satisfied with today.
1. No, definitely not 2. 3. Yes, more or less 4. 5. Yes, definitely

111 How ready are you to plan and facilitate a QI workshop?
1. Not Ready 2. 3. Mostly Ready 4. 5. Very Ready

112 Thank You :-)

