Introducing the Multi-Indicator Version of the RDQA Tool. Presented at the MEMS - MEASURE Evaluation Brown Bag, Abuja, December 7, 2012.


Introducing the Multi-Indicator Version of the RDQA Tool. Presented at the MEMS - MEASURE Evaluation Brown Bag, Abuja, December 7, 2012

Background - 1
■ National programs and donor-funded projects are working towards ambitious goals in the fight against HIV, TB and malaria.
■ Measuring success and improving management of these initiatives depend on strong M&E systems that produce quality data on program implementation.
■ Building on strategies such as the “Three Ones”, the “Stop TB Strategy” and the “RBM Global Strategic Plan”, a multi-partner project* was launched in mid-2006 to develop a joint Routine Data Quality Assessment (RDQA) Tool.
■ The objective of this initiative was to provide a common approach for assessing and improving data quality (between partners and with National Programs).
* Partners most directly involved include PEPFAR, USAID, WHO, Stop TB, the Global Fund and MEASURE Evaluation.

Background - 2
■ Importantly, funding is tied to performance, so there is a need to show the effectiveness of interventions.
■ Hence, quality data are imperative for demonstrating program effectiveness.
■ The single-indicator RDQA tool was used for joint national DQA exercises in 2008, 2009, 2011 and 2012 (led by NACA).
■ The multi-indicator tool has never been used in the country, and there is a need to sensitize M&E professionals to the potential opportunities it offers.
■ This tool provides an opportunity to evaluate data quality for selected priority indicators in different program areas at the same time and to identify areas for improvement.

Countries where RDQA has been used or is currently being implemented
 Kenya
 Tanzania
 South Africa, Lesotho, Swaziland
 Nigeria
 Cote d’Ivoire
 DRC
 Haiti
 Mozambique
 India
 Botswana
 Global Fund On-Site Data Verification (OSDV) by LFAs in many countries

Data Quality
■ Refers to the worth/accuracy of the information collected and focuses on ensuring that the processes of capturing, verifying and analyzing data are of a high standard.
■ RDQA tools facilitate this process and also provide an opportunity for capacity building.

Why Data Quality is Important - I
 Mistakes should be prevented rather than detected.
 Correcting data that have been wrongly recorded is difficult and expensive.
 The quality of the data is largely determined by how well the data are collected and how well forms are completed.
 In the presence of errors, data cannot be interpreted – they are useless.
 Increased data quality → increased reliability and usability.

Why Data Quality is Important - II
 Program planning
 Data use
 Program decision making
 Sharing program information
 Reporting/Accountability

Data Quality Assurance
 Data quality assessment involves checking data against several criteria/dimensions:
o Validity
o Integrity
o Reliability
o Timeliness
o Completeness
o Precision
o Confidentiality
 The DQA tool is used to assess the quality of the data and should be responsive to all seven dimensions.
 The assessment helps us determine areas of poor data quality and develop action plans with potential solutions.

Objectives of RDQA
■ VERIFY the quality of reported data for key indicators at selected sites.
■ ASSESS the ability of data-management systems to collect, manage and report quality data.
■ IMPLEMENT measures, with appropriate action plans, for strengthening the data management and reporting system and improving data quality.
■ MONITOR capacity improvements and performance of the data management and reporting system to produce quality data.

Uses of RDQA
 Routine data quality checks as part of on-going supervision
 Initial and follow-up assessments of data management and reporting systems – to measure performance improvement over time
 Strengthening program staff’s capacity in data management and reporting
 External assessment by partners and other stakeholders

Conceptual Framework of DQA
Generally, the quality of reported data is dependent on the underlying data management and reporting systems; stronger systems should produce better quality data.
[Diagram: Data-Management and Reporting System]
 Reporting levels: Service Points, Intermediate Aggregation Levels (e.g. LGAs, States) and the M&E Unit.
 Dimensions of quality data: accuracy, completeness, reliability, timeliness, confidentiality, precision, integrity.
 Functional components of a data management system needed to ensure data quality: M&E structure, functions and capabilities; indicator definitions and reporting guidelines; data-collection and reporting forms/tools; data management processes; links with the national reporting system; data quality mechanisms and controls.

RDQA Methodology: Chronology and Steps
Phases: 1. Preparation; 2. Implementation; 3. Action Plan; 4. Follow Up.
Steps:
1. Determine the scope of the DQA
2. Determine indicators, data sources and time period
3. Determine and notify facilities/sites
4. Assess the data management system
5. Verify data
6. Summarize findings and prepare an action plan
7. Implement activities and follow up
Implementation is conducted at the M&E Unit, service sites and intermediate aggregation levels, as appropriate, given the scope of the DQA.

RDQA Methodology: Protocols
The methodology for the DQA includes two (2) protocols:
1. Data Verifications (Protocol 1): quantitative comparison of recounted to reported data, and review of the timeliness, completeness and availability of reports.
2. Assessment of Data Management Systems (Protocol 2): qualitative assessment of the strengths and weaknesses of the data-collection and reporting system.

RDQA Methodology: Protocols
 Data Verification
o Documentation review
o Recounted results – trace and verify
o Cross-checks – compare with alternative data sources
 Reporting Performance
o Timeliness, completeness, availability (intermediate level and higher)
 System Assessment
o Are elements in place to ensure quality reporting?

RDQA Methodology: Data Verification Component
■ PURPOSE: Assess, on a limited scale, whether Service Delivery Points and Intermediate Aggregation Sites are collecting and reporting data accurately and on time.
■ The data verification step takes place in two stages:
- In-depth verifications at the Service Delivery Points; and
- Follow-up verifications at the Intermediate Aggregation Levels (Districts, Regions) and at the M&E Unit.
[Diagram: Step 5 – Trace and verify reported results. Indicator data are traced from Service Delivery Sites / Organizations through the Intermediate Aggregation Levels (e.g. District, Region) to the M&E Management Unit.]

RDQA Methodology: Data Verification – ILLUSTRATION
Service Delivery Site monthly reports (ARV number, each backed by Source Document 1): SDS 1 = 45; SDS 2 = 20; SDS 3 = 45; SDS 4 = 75; SDS 5 = 50; SDS 6 = 200.
District monthly reports: District 1 (SDS 1 = 45, SDS 2 = 20, TOTAL = 65); District 2 (SDS 3 = 45, TOTAL = 45); District 3 (SDS 4 = 75, TOTAL = 75); District 4 (SDS 5 = 50, SDS 6 = 200, TOTAL = 250).
M&E Unit/National monthly report: District 1 = 65; District 2 = 45; District 3 = 75; District 4 = 250; TOTAL = 435.
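To make the trace-and-verify arithmetic above concrete, the following is a minimal sketch in Python that recounts the site and district figures from the illustration and compares them with the reported totals. The data structures and code are hypothetical; the actual RDQA tool performs these recounts in an Excel workbook.

```python
# Minimal sketch of the trace-and-verify recount, using the illustrative
# figures from the slide above. The structure is hypothetical; the RDQA
# tool itself is an Excel workbook, not a script.

site_reports = {   # recounted ARV numbers at each Service Delivery Site
    "SDS 1": 45, "SDS 2": 20, "SDS 3": 45,
    "SDS 4": 75, "SDS 5": 50, "SDS 6": 200,
}
district_reports = {   # sites feeding each district, and the district's reported total
    "District 1": {"sites": ["SDS 1", "SDS 2"], "reported_total": 65},
    "District 2": {"sites": ["SDS 3"], "reported_total": 45},
    "District 3": {"sites": ["SDS 4"], "reported_total": 75},
    "District 4": {"sites": ["SDS 5", "SDS 6"], "reported_total": 250},
}
national_reported_total = 435   # total reported by the M&E Unit

# Recount each district total from its sites and compare with the reported total.
for district, info in district_reports.items():
    recounted = sum(site_reports[s] for s in info["sites"])
    status = "OK" if recounted == info["reported_total"] else "DISCREPANCY"
    print(f"{district}: recounted {recounted} vs reported {info['reported_total']} -> {status}")

# Recount the national total from the district reports.
recounted_national = sum(d["reported_total"] for d in district_reports.values())
print(f"National: recounted {recounted_national} vs reported {national_reported_total}")
```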

Service Delivery Points – Data Verification
TYPES OF DATA VERIFICATIONS AT THE SERVICE DELIVERY POINT
 Verification no. 1 – Documentation Review: review the availability and completeness of all indicator source documents for the selected reporting period. (In all cases)
 Verification no. 2 – Data Verification: trace and verify reported numbers: (1) recount the reported numbers from available source documents; (2) compare the verified numbers to the site-reported number; (3) identify reasons for any differences. (In all cases)
 Verification no. 3 – Cross-checks: perform “cross-checks” of the verified report totals with other data sources (e.g. inventory records, laboratory reports, etc.). (If feasible)

Service Delivery Points – Cross Checks
CROSS-CHECKS: perform cross-checks of the verified report totals with other data sources.
Indicator-specific notes for the auditor: cross-checking may be done by comparing (1) Patient Treatment Cards and the ART Register; and (2) Drug Stock Records and the ART Register. The code of the regimen dispensed to the patient is recorded in the ART Register, so the exact number of patients receiving each regimen in the facility at any time can be counted by reviewing the ART Register.
CROSS-CHECK 1.1: from Patient Treatment Cards to the ART Register.
 Was this cross-check performed? Yes
 If feasible, select 5% of Patient Treatment Cards (or at least 20 cards) for patients who are currently on treatment. How many cards were selected?
 How many of the patients selected were recorded in the ART Register? 3
 Calculate the % difference for cross-check 1.1: 60.0%
 If the result is below 90%, select an additional 5% of Patient Treatment Cards (or at least an extra 10 cards) and redo the calculation (ADD the numbers to the existing numbers in the above cells); repeat up to three times.
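As a rough illustration of the calculation this cross-check asks for, the sketch below computes the share of sampled Patient Treatment Cards whose patients can be traced to the ART Register and applies the re-sampling rule. The sample numbers are hypothetical, not taken from the slide.

```python
# Hedged sketch of cross-check 1.1: the percentage of sampled Patient
# Treatment Cards whose patients are found in the ART Register.
# The sample figures below are hypothetical.

def cross_check_percentage(cards_selected: int, found_in_register: int) -> float:
    """Return the cross-check result as a percentage of cards traced."""
    return 100.0 * found_in_register / cards_selected

cards_selected = 20        # hypothetical: at least 20 cards (or 5% of cards)
found_in_register = 18     # hypothetical: patients traced to the ART Register

result = cross_check_percentage(cards_selected, found_in_register)
print(f"Cross-check 1.1: {result:.1f}%")

# Per the slide, a result below 90% triggers an additional sample (added to
# the existing counts) and a recalculation, repeated up to three times.
if result < 90.0:
    print("Below 90% - select an additional sample and recalculate.")
```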

RDQA Methodology: Systems Assessment Component
■ PURPOSE: Identify potential risks to data quality created by the data-management and reporting systems at:
- the M&E Management Unit;
- the Service Delivery Points;
- any Intermediate Aggregation Level (District or Region).
■ The RDQA assesses both (1) the design and (2) the implementation of the data-management and reporting systems.
■ The assessment covers 8 functional areas (HR, training, data management processes, etc.).
[Diagram: Assess data management and reporting systems at the Service Delivery Sites / Organizations, the Intermediate Aggregation Levels (e.g. District, Region) and the M&E Management Unit.]

Functional Areas of an M&E System that Affect Data Quality
SYSTEMS ASSESSMENT QUESTIONS BY FUNCTIONAL AREA
I. M&E Capabilities, Roles and Responsibilities
1. Are key M&E and data-management staff identified with clearly assigned responsibilities?
II. Data Management Processes
2. Does clear documentation of collection, aggregation and manipulation steps exist?
III. Links with National Reporting System
3. Does the data collection and reporting system of the Program/Project link to the National Reporting System?
IV. Indicator Definitions
4. Are there operational indicator definitions meeting relevant standards, and are they systematically followed by all service points?
V. Data-collection and Reporting Forms and Tools
5. Are there standard data-collection and reporting forms that are systematically used?
6. Are source documents kept and made available in accordance with a written policy?
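The dashboards shown later summarize responses to these summary questions numerically. As a rough illustration only – the scoring scale below is an assumption for this sketch, not a specification of the RDQA workbook – responses can be mapped to scores and averaged per functional area:

```python
# Illustrative roll-up of systems-assessment responses into per-functional-area
# scores, as might feed a summary dashboard. The 3-point scale used here
# (Yes = 3, Partly = 2, No = 1) is an assumption, not the RDQA specification.

from statistics import mean

SCALE = {"Yes": 3, "Partly": 2, "No": 1}

responses = {  # hypothetical responses, keyed by functional area
    "I. M&E Capabilities, Roles and Responsibilities": ["Yes", "Partly"],
    "II. Data Management Processes": ["Partly", "No"],
    "V. Data-collection and Reporting Forms and Tools": ["Yes", "Yes"],
}

for area, answers in responses.items():
    score = mean(SCALE[a] for a in answers)
    print(f"{area}: {score:.1f} / 3")
```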

RDQA System Assessment

RDQA Outputs
1. Strength of the M&E System: an evaluation based on a review of the Program/project’s data management and reporting system, including responses to overall summary questions on how well the system is designed and implemented.
2. Verification Factors: generated from the trace-and-verify recounting exercise performed on primary records and/or aggregated reports (i.e. the ratio of the recounted value of the indicator to the reported value).
3. Available, On-time and Complete Reports: percentages calculated at the Intermediate Aggregation Level and the M&E Unit.
4. Action Plan for System Strengthening: one for each level assessed.
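A brief sketch of the two quantitative outputs (the verification factor and the reporting-performance percentages) follows; the input values and function names are hypothetical, since the real tool computes these inside its Excel workbook.

```python
# Hedged sketch of two quantitative RDQA outputs: the verification factor
# (recounted / reported) and the reporting-performance percentages.
# Input values are hypothetical; the RDQA tool computes these in Excel.

def verification_factor(recounted: float, reported: float) -> float:
    """Ratio of the recounted indicator value to the reported value.
    1.0 = exact match; below 1.0 suggests over-reporting; above 1.0 under-reporting."""
    return recounted / reported

def reporting_performance(reports: list) -> dict:
    """Percent of expected reports that were available, on time and complete."""
    expected = len(reports)
    return {
        "available": 100.0 * sum(r["available"] for r in reports) / expected,
        "on_time":   100.0 * sum(r["on_time"] for r in reports) / expected,
        "complete":  100.0 * sum(r["complete"] for r in reports) / expected,
    }

print(verification_factor(recounted=930, reported=1000))   # 0.93 -> over-reporting
print(reporting_performance([
    {"available": True,  "on_time": True,  "complete": True},
    {"available": True,  "on_time": False, "complete": True},
    {"available": False, "on_time": False, "complete": False},
]))
```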

RDQA Summary Statistics – Level Specific Dashboard

RDQA Summary Statistics – Global Dashboard

ILLUSTRATION: Example of Systems Findings at the M&E Unit (HIV/AIDS)
Reporting level: National M&E Unit
 Finding: No specific documentation specifying data-management roles and responsibilities, reporting timelines, standard forms, storage policy, etc.
Recommendation: Develop a data management manual to be distributed to all reporting levels.
 Finding: Inability of the M&E Unit to verify reported numbers because too many reports from Service Points are missing (67%).
Recommendation: Systematically file all reports from Service Points, and develop guidelines on how to address missing or incomplete reports.
 Finding: Most reports received by the M&E Unit are not signed off by any staff or manager from the Service Point.
Recommendation: Reinforce the need for documented review of submitted data – for example, by not accepting un-reviewed reports.

Reporting level: Intermediate Aggregation Level
 Finding: Inability to retrieve source documents (i.e. treatment forms) for a specific period.
Recommendation: Improve the source-document storage process by clearly identifying stored source documents by date.
Reporting level: Service Points
 Finding: Confusion regarding the definition of a patient “lost to follow-up” (3 months for Temeke Hospital; 2 months for Iringa Hospital).
Recommendation: The M&E Unit should clearly communicate to all service points the definition of a patient “lost to follow-up”.
 Finding: The service points do not systematically remove patients “lost to follow-up” from counts of the number of people on ART.
Recommendation: Develop a mechanism to ensure that patients “lost to follow-up” are systematically removed from the counts of the number of people on ART.
 Finding: In the case of “satellite sites”, the reporting system and source documents do not always identify the location of a patient.
Recommendation: Develop a coding system that clearly identifies a patient’s treatment location so that data verification can be accomplished.

Multi-Indicator RDQA Tool

Thank you…

MEASURE Evaluation is a MEASURE project funded by the U.S. Agency for International Development and implemented by the Carolina Population Center at the University of North Carolina at Chapel Hill in partnership with Futures Group International, ICF Macro, John Snow, Inc., Management Sciences for Health, and Tulane University. Views expressed in this presentation do not necessarily reflect the views of USAID or the U.S. Government. MEASURE Evaluation is the USAID Global Health Bureau's primary vehicle for supporting improvements in monitoring and evaluation in population, health and nutrition worldwide. Visit us online at