Session: 6 Understanding & Using the RDQA Tool Output


MEASURE Evaluation Data Quality Assurance Workshop
Session: 6 Understanding & Using the RDQA Tool Output

Topics to Cover
- Charts and graphs overview
- How to interpret RDQA graphs: bar charts, spider graphs, line graphs

The RDQA Process
1. Data Verification
2. System Assessment
3. Interpret the Output
4. Develop Action Plans
5. Disseminate Results
6. Ongoing Monitoring & Follow-up (the cycle then repeats)

Charts & Graphs
Charts and graphs summarize information visually.
- Advantage: the information is easier to understand
- Disadvantage: you might lose some of the detail
The RDQA tool automatically generates summary graphs of assessment data.
[Facilitator note: Open the RDQA tool to show a quick overview of each worksheet.]

Basic Guidance for Building Graphs
- Ensure the graph has a title
- Label the components of your graph
- Indicate the source of the data, with the date
- Provide the number of observations (n=xx) as a reference point
- Add a footnote if more information is needed

Interpreting Graphs
The summary statistics that are calculated include the following:
- Strength of the data management and reporting system, based on a review of the programme's data collection and reporting system
- Accuracy of reported data, through verification factors generated from the recounting exercise performed at each level of the reporting system
- Availability, completeness, and timeliness of reports, through percentages calculated at the health district(s) and the national level
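The availability, completeness, and timeliness statistics are simple percentages. A minimal sketch of how such percentages can be computed is below; the formulas (received/expected, complete/received, on-time/received) are common definitions and an assumption here, not necessarily the RDQA tool's exact internal formulas.

```python
def report_rates(expected, received, on_time, complete):
    """Illustrative reporting-performance percentages for one level.

    NOTE: these formulas are assumed for illustration; the RDQA tool's
    exact definitions may differ.
    """
    return {
        "availability": 100.0 * received / expected,   # reports received
        "completeness": 100.0 * complete / received,   # received reports that are complete
        "timeliness":   100.0 * on_time / received,    # received reports that arrived on time
    }

# Example: a district expecting 20 monthly reports
rates = report_rates(expected=20, received=18, on_time=15, complete=16)
```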

Types of Graphs in the RDQA Tool
- Column or bar chart: used for comparing categories of data
- Spider graph: used to compare categories in an alternative format

Bar Charts in the RDQA Tool
- Generated on the Global Dashboard, intermediate aggregation, and service delivery point sheets
- Include charts for both data verification and system assessment outputs
- At the intermediate aggregation and global levels, reporting performance data are also charted

Output of Data Verification
Verification factor = recounted data / reported data
- A factor below 100% indicates over-reporting; a factor above 100% indicates under-reporting
- Recommended range of acceptability: 100% +/- 10% (90%-110%)
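The verification factor is a simple ratio, and the slide's recommended acceptability range can be applied mechanically. A minimal sketch (the function names are illustrative, not from the RDQA tool):

```python
def verification_factor(recounted, reported):
    """Verification factor = recounted / reported, as a percentage."""
    return 100.0 * recounted / reported

def classify(vf, tolerance=10.0):
    """Classify a verification factor against the 100% +/- 10% range."""
    if vf < 100.0 - tolerance:
        return "over-reporting"    # reported more events than could be recounted
    if vf > 100.0 + tolerance:
        return "under-reporting"   # recounted more events than were reported
    return "acceptable"

# Example: 80 events recounted in source documents vs. 100 reported
status = classify(verification_factor(recounted=80, reported=100))
```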

Bar Chart – Overall Average
[Chart: overall average verification factors, with under-reporting and over-reporting regions annotated]

Bar Charts – Disaggregated by Level Do we have under-reporting or over-reporting in this bar chart?

Reporting Performance

Spider Graph
- A visual display of information on several axes
- What matters are the points on the axes, NOT the enclosed area

Spider Graph – How to Draw One
Categories and scores (axis scale: 1 to 3):
- M&E capabilities, roles and responsibilities: 2.5
- Training: 3.0
- Indicator definition: 1.0
- Data reporting requirements: 1.5
- Data collection, reporting forms and tools: 2.0
- Data mngt process and quality control: (score not shown)
- Links with national reporting systems: 0.5
Step 1 – draw the scores of each category on the graph
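Each category gets its own axis, spaced evenly around a circle, and the score sets how far from the centre the point sits. A minimal sketch of that placement (illustrative only, not the RDQA tool's own plotting code):

```python
import math

def spider_points(scores, max_score=3.0):
    """Place category scores on evenly spaced spider-graph axes.

    Returns one (x, y) point per score, with radius proportional to
    score / max_score. Illustrative sketch, not the tool's plotting code.
    """
    n = len(scores)
    points = []
    for i, score in enumerate(scores):
        angle = 2 * math.pi * i / n   # direction of this category's axis
        r = score / max_score         # normalised distance from the centre
        points.append((r * math.cos(angle), r * math.sin(angle)))
    return points

# Example with two categories scoring 3.0 and 1.5 on a 3-point scale
pts = spider_points([3.0, 1.5])
```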

Interpreting the Graph
Step 2 – Connect the dots. Focus on the points/scores, not on the area.
[Spider graph axes: M&E capabilities, roles and responsibilities; Training; Indicator definition; Data reporting requirements; Data collection, reporting forms and tools; Data mngt process and quality control; Links with national reporting systems]

Line Graphs
- Track outcomes from repeat data verifications and system assessments
- Important for routine monitoring
- Chart data by service delivery site and by district
- Require you to build the graph yourself

Routine Monitoring Line Graph How would you interpret this graph?

Routine Monitoring Line Graph
[Chart: verification factors over time, with under-reporting and over-reporting regions annotated]

Questions & Answers

MEASURE Evaluation is funded by the U.S. Agency for International Development (USAID) and implemented by the Carolina Population Center at the University of North Carolina at Chapel Hill in partnership with Futures Group, ICF International, John Snow, Inc., Management Sciences for Health, and Tulane University. Views expressed in this presentation do not necessarily reflect the views of USAID or the U.S. government. MEASURE Evaluation is the USAID Global Health Bureau's primary vehicle for supporting improvements in monitoring and evaluation in population, health and nutrition worldwide.

www.measureevaluation.org