MEASURE Evaluation Data Quality Assurance Workshop
Session 7: Developing Action Plans Based on Results

Topics to Cover
• How to link the results with actions for improvement
• How to compile the final action plan using the RDQA tool template

The RDQA Process
Data Verification → System Assessment → Interpret the Output → Develop Action Plans → Disseminate Results → Ongoing Monitoring & Follow-up

Linking Results to Action
When the RDQA Tool is filled out electronically, it uses the information entered to produce dashboards for each level and site, as well as dashboards that aggregate the results from all levels and sites included in the assessment (see the illustrative sketch below).
• At each site, the team drafts recommendations for the site based on the assessment results.
• The recommendations from each site are summarized in the action plan generated by the tool.
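The sketch below (Python) gives a rough sense of the kind of per-site and aggregate dashboard values that can be derived from reported and recounted figures. The field names, sample numbers, and the simple recounted-to-reported ratio are illustrative assumptions, not the RDQA spreadsheet's actual formulas.

# Illustrative sketch only; field names and figures are hypothetical,
# not the RDQA spreadsheet's actual implementation.

sites = [
    {"site": "Clinic A", "reported": 120, "recounted": 110},
    {"site": "Clinic B", "reported": 95, "recounted": 97},
    {"site": "Clinic C", "reported": 200, "recounted": 180},
]

def verification_factor(recounted, reported):
    """Ratio of the recounted value to the reported value (1.0 means a perfect match)."""
    return recounted / reported if reported else None

# Per-site dashboard values
for s in sites:
    s["vf"] = verification_factor(s["recounted"], s["reported"])
    print(f"{s['site']}: verification factor = {s['vf']:.2f}")

# Aggregate dashboard across all sites included in the assessment
total_reported = sum(s["reported"] for s in sites)
total_recounted = sum(s["recounted"] for s in sites)
print(f"All sites combined: {verification_factor(total_recounted, total_reported):.2f}")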

Recommendation Templates (Service Delivery Point & Intermediate Aggregate)
Templates are provided in each delivery site and intermediate aggregate worksheet to summarize recommendations for the sites based on the results of the assessment.

Compiling the Final Action Plan
• The final output of the RDQA is an action plan for improving data quality.
• Based on the findings and recommendations for each site, and for the programme as a whole, an overall action plan should be developed and discussed with the programme manager(s) and relevant M&E staff.

Compiling the Final Action Plan
• Decisions on where to invest resources for system strengthening should be based on:
  - the relative strengths and weaknesses of the different functional areas of the reporting system identified by the RDQA
  - consideration of problem magnitude, feasibility, cost, resources needed, and capacity
• The action plan should include the following (an illustrative sketch of a single entry follows this list):
  - identified strengthening measures
  - the staff responsible
  - the timeline for completion
  - resources required
  - follow-up
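As a minimal sketch of what one action-plan entry could look like, the Python snippet below captures the fields listed above as a small data structure. The class name, field names, and example values are hypothetical; the actual RDQA action plan is a spreadsheet template, not code.

from dataclasses import dataclass

@dataclass
class ActionPlanItem:
    """One row of a final action plan, mirroring the fields listed above (hypothetical structure)."""
    strengthening_measure: str
    responsible_staff: str
    timeline: str              # target date for completion
    resources_required: str
    follow_up: str

# Hypothetical example entry
item = ActionPlanItem(
    strengthening_measure="Standardize the facility register used for monthly reporting",
    responsible_staff="District M&E officer",
    timeline="End of next quarter",
    resources_required="Printed registers; one-day refresher training",
    follow_up="Review register completeness at the next supervision visit",
)
print(item)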

Discussion
What criteria could you use to determine which action points to include in your list of recommendations at the site?

Action Plan at the National M&E Unit
• The recommendations template should be filled in at the end of each site visit, in collaboration with site staff and taking the findings into account.
• The national M&E action plan should summarize the key issues that the national M&E unit will follow up at the various levels of the system (see the consolidation sketch below).
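To illustrate the consolidation step, the sketch below groups hypothetical site-level recommendations by the level responsible for follow-up. The tagging scheme, field names, and example issues are assumptions made for illustration only, not part of the RDQA tool.

from collections import defaultdict

# Hypothetical site-level recommendations, each tagged with the level
# responsible for follow-up.
site_recommendations = [
    {"site": "Clinic A", "issue": "Incomplete source documents", "follow_up_level": "District"},
    {"site": "Clinic B", "issue": "No written procedure for late reporting", "follow_up_level": "National"},
    {"site": "Clinic C", "issue": "Incomplete source documents", "follow_up_level": "District"},
]

# National summary: key issues grouped by the level that must act on them.
summary = defaultdict(set)
for rec in site_recommendations:
    summary[rec["follow_up_level"]].add(rec["issue"])

for level, issues in summary.items():
    print(f"{level}-level follow-up:")
    for issue in sorted(issues):
        print(f"  - {issue}")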

Action Plan Templates

Prioritizing Potential Action Points
• What is the magnitude of the problem? (small/big)
• How feasible is the solution? (not feasible/feasible)
• How much will it cost? (low/high cost)
• What other resources are needed? (minimal/substantial)
• What capacity exists to implement the solution? (little capacity/excellent capacity)
A simple way to combine these criteria into a ranking is sketched below.
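One simple way to make these questions operational is to score each candidate action point on every criterion and rank by the total. The 1-to-3 scale, the example action points, and the unweighted sum below are illustrative assumptions, not a prescribed method.

# Illustrative prioritization sketch; the 1-3 scale and the example scores
# are hypothetical. Cost is scored as "low_cost" so that a cheaper action
# receives a higher score, keeping all criteria in the same direction.

criteria = ["magnitude", "feasibility", "low_cost", "resources_available", "capacity"]

action_points = {
    "Reconcile facility registers with monthly reports": {
        "magnitude": 3, "feasibility": 3, "low_cost": 2,
        "resources_available": 2, "capacity": 3,
    },
    "Deploy a new electronic reporting system": {
        "magnitude": 3, "feasibility": 1, "low_cost": 1,
        "resources_available": 1, "capacity": 1,
    },
}

def priority_score(scores):
    """Unweighted sum across the five criteria; a higher total means higher priority."""
    return sum(scores[c] for c in criteria)

ranked = sorted(action_points.items(), key=lambda kv: priority_score(kv[1]), reverse=True)
for name, scores in ranked:
    print(f"{priority_score(scores):>2}  {name}")

Per-criterion weights could be added if, for example, feasibility should count for more than cost; the unweighted sum is only the simplest possible choice.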

Questions & Answers

MEASURE Evaluation is funded by the U.S. Agency for International Development (USAID) and implemented by the Carolina Population Center at the University of North Carolina at Chapel Hill in partnership with Futures Group, ICF International, John Snow, Inc., Management Sciences for Health, and Tulane University. Views expressed in this presentation do not necessarily reflect the views of USAID or the U.S. government. MEASURE Evaluation is the USAID Global Health Bureau's primary vehicle for supporting improvements in monitoring and evaluation in population, health and nutrition worldwide.