
1 Introducing the Multi-Indicator Version of the RDQA Tool
Presented at the MEMS - MEASURE Evaluation Brown Bag, Abuja, December 7, 2012

2 Background - 1
■ National programs and donor-funded projects are working towards ambitious goals in the fight against HIV, TB and malaria.
■ Measuring the success and improving the management of these initiatives depend on strong M&E systems that produce quality data about program implementation.
■ Building on strategies such as the “Three Ones”, the “Stop TB Strategy” and the “RBM Global Strategic Plan”, a multi-partner project* was launched in mid-2006 to develop a joint Routine Data Quality Assessment (RDQA) Tool.
■ The objective of this initiative was to provide a common approach for assessing and improving data quality, both between partners and with National Programs.
* Partners most directly involved include PEPFAR, USAID, WHO, Stop TB, the Global Fund and MEASURE Evaluation.

3 Background - 2
■ Funding is tied to performance and to demonstrating the effectiveness of interventions.
■ Quality data are therefore imperative for showing program effectiveness.
■ The single-indicator RDQA tool was used for the joint national DQA exercises in 2008, 2009, 2011 and 2012 (led by NACA).
■ The multi-indicator tool has never been used in the country, and there is a need to sensitize M&E professionals to the opportunities it offers.
■ The tool makes it possible to assess data quality for selected priority indicators in different program areas at the same time and to identify areas for improvement.

4 Countries where RDQA has been used or is currently being implemented
- Kenya
- Tanzania
- South Africa, Lesotho, Swaziland
- Nigeria
- Cote d’Ivoire
- DRC
- Haiti
- Mozambique
- India
- Botswana
- Global Fund On-Site Data Verification (OSDV) by LFAs in many countries

5 Data Quality
■ Refers to the worth and accuracy of the information collected, and focuses on ensuring that the process of capturing, verifying and analysing data is of a high standard.
■ RDQA tools facilitate this process and also provide an opportunity for capacity building.

6 Why Data Quality is Important - I
■ Mistakes should be prevented rather than detected.
■ Correcting data that have been wrongly recorded is difficult and expensive.
■ The quality of the data is largely determined by how well the data are collected and how well the forms are completed.
■ In the presence of errors, data cannot be interpreted and are effectively useless.
■ Increased data quality → increased reliability and usability.

7 Why Data Quality is Important - II
■ Program planning
■ Data use
■ Program decision making
■ Sharing program information
■ Reporting/Accountability

8 Data Quality Assurance
■ Data quality assessment involves checking data against several criteria/dimensions:
o Validity
o Integrity
o Reliability
o Timeliness
o Completeness
o Precision
o Confidentiality
■ A DQA tool is used to assess the quality of the data and should address all seven dimensions.
■ The assessment helps determine areas of poor data quality and develop action plans with potential solutions.

9 Objectives of RDQA
VERIFY the quality of reported data for key indicators at selected sites.
ASSESS the ability of data-management systems to collect, manage and report quality data.
IMPLEMENT measures, with appropriate action plans, for strengthening the data management and reporting system and improving data quality.
MONITOR capacity improvements and the performance of the data management and reporting system to produce quality data.

10 Uses of RDQA
■ Routine data quality checks as part of on-going supervision
■ Initial and follow-up assessments of data management and reporting systems, to measure performance improvement over time
■ Strengthening program staff’s capacity in data management and reporting
■ External assessment by partners and other stakeholders

11 Conceptual Framework of DQA
Generally, the quality of reported data depends on the underlying data management and reporting systems; stronger systems should produce better quality data.
[Diagram: the Data-Management and Reporting System]
■ Reporting levels: Service Points → Intermediate Aggregation Levels (e.g. LGAs, States) → M&E Unit
■ Dimensions of quality data: Accuracy, Completeness, Reliability, Timeliness, Confidentiality, Precision, Integrity
■ Functional components of a data management system needed to ensure data quality: I. M&E Structure, Functions and Capabilities; II. Indicator Definitions and Reporting Guidelines; III. Data-collection and Reporting Forms/Tools; IV. Data Management Processes; V. Links with the National Reporting System; plus data quality mechanisms and controls

12 RDQA Methodology: Chronology and Steps
Phase 1 - Preparation: 1. Determine the scope of the DQA; 2. Determine indicators, data sources and time period; 3. Determine and notify facilities/sites.
Phase 2 - Implementation: 4. Assess the data management system; 5. Verify data.
Phase 3 - Action Plan: 6. Summarize findings and prepare an action plan.
Phase 4 - Follow Up: 7. Implement activities and follow up.
Implementation is conducted at the M&E Unit, service sites and intermediate aggregation levels, as appropriate, given the scope of the DQA.

13 RDQA Methodology: Protocols
■ The methodology for the DQA includes two (2) protocols:
1. Data Verifications (Protocol 1): quantitative comparison of recounted to reported data, and review of the timeliness, completeness and availability of reports.
2. Assessment of Data Management Systems (Protocol 2): qualitative assessment of the strengths and weaknesses of the data-collection and reporting system.

14 RDQA Methodology: Protocols
■ Data Verification
- Documentation review
- Recounted results: trace and verify
- Cross-checks: compare with alternative data sources
■ Reporting Performance
- Timeliness, completeness, availability (intermediate level and higher)
■ System Assessment
- Are elements in place to ensure quality reporting?

15 RDQA Methodology: Data Verification Component
■ PURPOSE: Assess, on a limited scale, whether Service Delivery Points and Intermediate Aggregation Sites are collecting and reporting data accurately and on time.
■ The data verification step takes place in two stages:
- in-depth verifications at the Service Delivery Points; and
- follow-up verifications at the Intermediate Aggregation Levels (Districts, Regions) and at the M&E Unit.
[Diagram: Step 5, Trace and Verify Reported Results - indicator data are traced from the Service Delivery Sites/Organizations through the Intermediate Aggregation Levels (e.g. District, Region) to the M&E Management Unit]

16 RDQA Methodology: Data Verification (Illustration)
Service Delivery Site monthly reports (ARV numbers, each recounted from Source Document 1): Site 1: 45; Site 2: 20; Site 3: 45; Site 4: 75; Site 5: 50; Site 6: 200.
District monthly reports: District 1 (Sites 1 and 2): 45 + 20 = 65; District 2 (Site 3): 45; District 3 (Site 4): 75; District 4 (Sites 5 and 6): 50 + 200 = 250.
M&E Unit/National monthly report: District 1: 65; District 2: 45; District 3: 75; District 4: 250; TOTAL: 435.
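To make the roll-up arithmetic concrete, here is a minimal sketch (Python, not part of the original deck) that recomputes each district total from its site reports and the national total from the district reports, flagging any discrepancy; the figures are the ones shown in the illustration above.

```python
# Trace-and-verify sketch using the figures from the illustration.
# Site-level ARV counts, as recounted from Source Document 1.
site_reports = {
    "District 1": {"Site 1": 45, "Site 2": 20},
    "District 2": {"Site 3": 45},
    "District 3": {"Site 4": 75},
    "District 4": {"Site 5": 50, "Site 6": 200},
}
# Totals as reported upward by each district office.
district_reports = {"District 1": 65, "District 2": 45,
                    "District 3": 75, "District 4": 250}
national_report = 435  # total reported at the M&E Unit

# Verify each district total against the recounted sum of its site reports.
for district, sites in site_reports.items():
    recounted = sum(sites.values())
    reported = district_reports[district]
    status = "OK" if recounted == reported else "DISCREPANCY"
    print(f"{district}: recounted {recounted} vs reported {reported} -> {status}")

# Verify the national total against the sum of the district reports.
recounted_national = sum(district_reports.values())
status = "OK" if recounted_national == national_report else "DISCREPANCY"
print(f"National: recounted {recounted_national} vs reported {national_report} -> {status}")
```

In a real RDQA the recounted values come from physically recounting source documents at each level, not from the reports themselves; the sketch only shows the comparison logic.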

17 Service Delivery Points – Data Verification
Types of data verifications at the Service Delivery Point:
■ Verification no. 1: Documentation Review (in all cases). Review the availability and completeness of all indicator source documents for the selected reporting period.
■ Verification no. 2: Data Verification (in all cases). Trace and verify reported numbers: (1) recount the reported numbers from available source documents; (2) compare the verified numbers to the site-reported numbers; (3) identify reasons for any differences.
■ Verification no. 3: Cross-checks (if feasible). Perform cross-checks of the verified report totals with other data sources (e.g. inventory records, laboratory reports, etc.).

18 Service Delivery Points – Cross-Checks
CROSS-CHECKS: Perform cross-checks of the verified report totals with other data sources.
Indicator-specific notes for the auditor: cross-checking may be done by comparing (1) Patient Treatment Cards with the ART Register; and (2) Drug Stock Records with the ART Register. The code of the regimen dispensed to each patient is recorded in the ART Register, so the exact number of patients receiving each regimen in the facility at any time can be counted by reviewing the ART Register.
CROSS-CHECK 1.1: From Patient Treatment Cards to the ART Register. Was this cross-check performed? Yes.
4.1 If feasible, select 5% of the Patient Treatment Cards (or at least 20 cards) of patients currently on treatment. How many cards were selected? 5
4.2 How many of the selected patients were recorded in the ART Register? 3
Result of cross-check 1.1: 3/5 = 60.0%. If the result is below 90%, select an additional 5% of Patient Treatment Cards (or at least 10 more cards) and redo the calculation, adding the new numbers to the existing counts; repeat up to three times.
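A short sketch of this calculation and the repeat rule (Python, illustrative only; the 90% threshold and the 5-card sample come from the slide, while the follow-up sample figures and the function name are hypothetical):

```python
def cross_check_result(cards_selected: int, found_in_register: int) -> float:
    """Percentage of sampled Patient Treatment Cards found in the ART Register."""
    return 100.0 * found_in_register / cards_selected

# Figures from the slide: 5 cards sampled, 3 found in the ART Register.
selected, found = 5, 3
result = cross_check_result(selected, found)
print(f"Cross-check 1.1: {result:.1f}%")  # 60.0%

# Repeat rule: while the result is below 90%, draw an extra sample
# (at least 10 more cards), ADD it to the running totals and recompute;
# repeat up to three times.
THRESHOLD = 90.0
extra_draws = 0
while result < THRESHOLD and extra_draws < 3:
    extra_selected, extra_found = 10, 9   # hypothetical follow-up sample
    selected += extra_selected
    found += extra_found
    result = cross_check_result(selected, found)
    extra_draws += 1
    print(f"After extra sample {extra_draws}: {result:.1f}%")
```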

19 RDQA Methodology: Systems Assessment Component
■ PURPOSE: Identify potential risks to data quality created by the data-management and reporting systems at:
- the M&E Management Unit;
- the Service Delivery Points;
- any Intermediate Aggregation Level (District or Region).
■ The RDQA assesses both (1) the design and (2) the implementation of the data-management and reporting systems.
■ The assessment covers 8 functional areas (HR, training, data management processes, etc.).
[Diagram: Assess data management and reporting systems at the Service Delivery Sites/Organizations, the Intermediate Aggregation Levels (e.g. District, Region) and the M&E Management Unit]

20 Functional Areas of an M&E System that Affect Data Quality
Systems assessment questions by functional area:
I. M&E Capabilities, Roles and Responsibilities
1. Are key M&E and data-management staff identified, with clearly assigned responsibilities?
II. Data Management Processes
2. Does clear documentation of collection, aggregation and manipulation steps exist?
III. Links with National Reporting System
3. Does the data collection and reporting system of the Program/Project link to the National Reporting System?
IV. Indicator Definitions
4. Are there operational indicator definitions meeting relevant standards, and are they systematically followed by all service points?
V. Data-collection and Reporting Forms and Tools
5. Are there standard data-collection and reporting forms that are systematically used?
6. Are source documents kept and made available in accordance with a written policy?

21 RDQA System Assessment

22 RDQA Outputs
1. Strength of the M&E System: an evaluation based on a review of the Program/Project’s data management and reporting system, including responses to overall summary questions on how well the system is designed and implemented.
2. Verification Factors: generated from the trace-and-verify recounting exercise performed on primary records and/or aggregated reports (i.e. the ratio of the recounted value of the indicator to the reported value).
3. Available, On-time and Complete Reports: percentages calculated at the Intermediate Aggregation Levels and the M&E Unit.
4. Action Plan for System Strengthening: one for each level assessed.
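Outputs 2 and 3 are simple ratios. The sketch below (Python, illustrative; the variable names and sample figures are ours, and the tool may define the denominators slightly differently) shows how they could be computed.

```python
# Verification factor: recounted value / reported value, as a percentage.
# 100% = perfect agreement; <100% suggests over-reporting; >100% under-reporting.
def verification_factor(recounted: int, reported: int) -> float:
    return 100.0 * recounted / reported

print(f"Verification factor: {verification_factor(45, 50):.1f}%")  # 90.0%

# Reporting performance at an aggregation level, as percentages of the
# reports expected from the level below (sample figures, for illustration).
reports_expected = 40
reports_received = 36   # available
reports_on_time  = 30
reports_complete = 33

availability = 100.0 * reports_received / reports_expected  # 90.0%
timeliness   = 100.0 * reports_on_time  / reports_expected  # 75.0%
completeness = 100.0 * reports_complete / reports_expected  # 82.5%
print(f"Available {availability:.1f}% | On time {timeliness:.1f}% | Complete {completeness:.1f}%")
```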

23 RDQA Summary Statistics – Level Specific Dashboard

24 RDQA Summary Statistics – Global Dashboard

25 Example of Systems Findings at the M&E Unit (HIV/AIDS) - ILLUSTRATION
Reporting level: National M&E Unit
■ Finding: No specific documentation specifying data-management roles and responsibilities, reporting timelines, standard forms, storage policy, etc. Recommendation: develop a data management manual to be distributed to all reporting levels.
■ Finding: Inability to verify reported numbers at the M&E Unit because too many reports from Service Points are missing (67%). Recommendations: systematically file all reports from Service Points; develop guidelines on how to address missing or incomplete reports.
■ Finding: Most reports received by the M&E Unit are not signed off by any staff or manager from the Service Point. Recommendation: reinforce the need for documented review of submitted data, for example by not accepting unreviewed reports.

26 Reporting level: Intermediate Aggregation Level
■ Finding: Inability to retrieve source documents (i.e. treatment forms) for a specific period. Recommendation: improve the source-document storage process by clearly labelling stored source documents by date.
Reporting level: Service Points
■ Finding: Confusion regarding the definition of a patient “lost to follow-up” (3 months at Temeke Hospital; 2 months at Iringa Hospital). Recommendation: the M&E Unit should clearly communicate the definition of a patient “lost to follow-up” to all service points.
■ Finding: The service points do not systematically remove patients “lost to follow-up” from counts of the number of people on ART. Recommendation: develop a mechanism to ensure that patients “lost to follow-up” are systematically removed from these counts.
■ Finding: In the case of “satellite sites”, the reporting system and source documents do not always identify the location of a patient. Recommendation: develop a coding system that clearly identifies a patient’s treatment location so that data verification can be accomplished.

27 Multi-Indicator RDQA Tool

28 Thank you…

29 MEASURE Evaluation is a MEASURE project funded by the U.S. Agency for International Development and implemented by the Carolina Population Center at the University of North Carolina at Chapel Hill in partnership with Futures Group International, ICF Macro, John Snow, Inc., Management Sciences for Health, and Tulane University. Views expressed in this presentation do not necessarily reflect the views of USAID or the U.S. Government. MEASURE Evaluation is the USAID Global Health Bureau's primary vehicle for supporting improvements in monitoring and evaluation in population, health and nutrition worldwide. Visit us online at http://www.cpc.unc.edu/measure

