Implementing the Data Quality Assessment Tool


1 Implementing the Data Quality Assessment Tool
Data Systems Quality

2 Session Overview
Why data quality matters
Dimensions of data quality
Thoughts about improving data quality
The Data Quality Assessment Tool
Activity: Implementing the Tool

3 Data Management System
Data Quality: the REAL world versus the data management system. In the real world, project activities are implemented in the field; these activities are designed to produce quantifiable results. A data management system (DMS) represents these activities by collecting the results that were produced and mapping them to a recording system. Data quality is how well the DMS represents the real world. The real world can be thought of as the services being delivered by your program, project, or intervention, and we want our data management system to reflect it faithfully. Think of a mirror: a perfect, high-quality mirror versus a convex, concave, or distorted one. How well do our data collection tools (the mirrors) reflect what is really happening in our programs?

4 Testing and Counseling: How are these data collected?
A person walks into the facility and is entered in the facility register. How are the data aggregated at the facility level? How are they forwarded to the next level? To the national level? To the international level?

5 ARV Treatment: How are these data collected?
A person tests positive for HIV. When does he or she begin receiving ARVs? How is this recorded in facility records? How are the data aggregated at the facility level? At the next level? At the national level? At the international level?

6 OVC Care: How are these data collected?
A child is identified as being an orphan or vulnerable: how? The child receives care from an organization: which ones, and how many? How is this recorded at the organizational level? How are the data aggregated at the next level? At the national level? At the international level? How do we know that the child did not receive care from more than one organization?

7 Why is data quality important?
Governments and donors are collaborating on the “Three Ones.” Accountability for funding and reported results is increasingly important. Quality data are needed at the program level for management decisions.

8 Data quality and PEPFAR/GFATM
A continuous cycle links target setting, results reporting, and improved program and resource management, with data quality at its center. Good quality data drive this process: they are the foundation for making good decisions, improving program and resource management, and setting realistic targets.

9 Data Quality
In the real world, project activities are implemented in the field and are designed to produce quantifiable results. An information system represents these activities by collecting the results that were produced and mapping them to a recording system. Data quality is how well the information system represents the real world, judged along six dimensions: 1. Accuracy, 2. Reliability, 3. Completeness, 4. Precision, 5. Timeliness, 6. Integrity.

10 Dimensions of Data Quality
Validity: Valid data are considered accurate; they measure what they are intended to measure.
Reliability: The data are measured and collected consistently.
Completeness: Completely inclusive; the information system represents the complete list of eligible names and not a fraction of the list.
Precision: The data have sufficient detail.
Timeliness: Data are up to date (current), and information is available on time.
Integrity: The data are protected from deliberate bias or manipulation for political or personal reasons.

11 Validity/Accuracy: Questions to ask…
What is the relationship between the activity/program and what you are measuring? What is the data transcription process? Is there potential for error? Are steps being taken to limit transcription error (e.g., double keying of data for large surveys, built-in validation checks, random checks)?

12 Reliability: Questions to ask…
Is the same instrument used from year to year, site to site? Is the same data collection process used from year to year, site to site? Are procedures in place to ensure that data are free of significant error and that bias is not introduced (e.g., instructions, indicator reference sheets, training, etc.)?

13 Reliability: Questions to ask…
If there are data errors, what do you do with that information? If raw data need to be manipulated, are the correct formulae being applied consistently across sites? How are missing or incomplete data handled? Are final numbers reported accurately; does the total add up?
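A minimal sketch of the "does the total add up?" check described above, assuming hypothetical inputs: a list of raw site counts (with None marking missing data) and the figure that was actually reported.

```python
def total_matches(site_counts, reported_total):
    """Recompute the total from raw site counts and compare it to the reported figure."""
    recomputed = sum(count for count in site_counts if count is not None)
    missing = sum(1 for count in site_counts if count is None)
    return {
        "recomputed_total": recomputed,
        "reported_total": reported_total,
        "difference": reported_total - recomputed,
        "sites_with_missing_data": missing,
    }

# Example: total_matches([45, 20, None, 75], 145)
# -> a difference of 5 and one site with missing data, both of which need explanation.
```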

14 Completeness: Questions to ask
Are the data from all sites that are to report included in aggregate data? If not, which sites are missing? Is there a pattern to the sites that were not included in the aggregation of data? What steps are taken to ensure completeness of data?
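A minimal sketch of the completeness check above: listing which expected sites are absent from the aggregate. The site identifiers are hypothetical.

```python
def missing_sites(expected_sites, received_reports):
    """Return the expected sites that have no report in the aggregation."""
    reported = {report["site_id"] for report in received_reports}
    return sorted(set(expected_sites) - reported)

# Example:
# missing_sites(["SDS-1", "SDS-2", "SDS-3"],
#               [{"site_id": "SDS-1"}, {"site_id": "SDS-3"}])
# -> ["SDS-2"]
```

Reviewing this list over several reporting periods helps reveal whether there is a pattern to the sites that are not included.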

15 Precision: Questions to ask…
How is margin of error being addressed? Are the margins of error acceptable for program decision making? Have issues around precision been reported? Would an increase in the degree of accuracy be more costly than the increased value of the information?
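A small illustrative sketch (not part of the DQA Tool) of one common way to express precision: the margin of error of an estimated proportion. The sample values are hypothetical.

```python
import math

def margin_of_error(p_hat, n, z=1.96):
    """Approximate 95% margin of error for a proportion p_hat estimated from n observations."""
    return z * math.sqrt(p_hat * (1.0 - p_hat) / n)

# Example: a coverage estimate of 40% from a sample of 400 clients has a
# margin of error of roughly +/- 4.8 percentage points.
print(round(100 * margin_of_error(0.40, 400), 1))
```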

16 Timeliness: Questions to ask…
Are data available on a frequent enough basis to inform program management decisions? Is a regularized schedule of data collection in place to meet program management needs? Are data from within the policy period of interest (i.e. are the data from a point in time after the intervention has begun)? Are the data reported as soon as possible after collection?
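A minimal sketch of the timeliness questions above: checking that a report's data fall inside the period of interest and that it was submitted by the deadline. The field names and dates are hypothetical.

```python
from datetime import date

def timeliness_check(report, period_start, period_end, deadline):
    """Return simple timeliness flags for one report."""
    return {
        "in_period": period_start <= report["collection_date"] <= period_end,
        "submitted_on_time": report["submission_date"] <= deadline,
    }

# Example:
# timeliness_check({"collection_date": date(2006, 1, 15),
#                   "submission_date": date(2006, 2, 5)},
#                  date(2005, 11, 1), date(2006, 1, 31), date(2006, 2, 15))
# -> {"in_period": True, "submitted_on_time": True}
```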

17 Integrity: Questions to ask…
Are there risks that data are manipulated for personal or political reasons? What systems are in place to minimize such risks? Has there been an independent review?

18 During this workshop, think about…
How well does your information system function? Are the definitions of indicators clear and understood at all levels? Do individuals and groups understand their roles and responsibilities? Does everyone understand the specific reporting timelines—and why they need to be followed?

19 …Keep thinking about… Are data collection instruments and reporting forms standardized and compatible? Do they have clear instructions? Do you have documented data review procedures for all levels, and do you use them? Are you aware of potential data quality challenges, such as missing data, double counting, and loss to follow-up? How do you address them? What are your policies and procedures for storing and filing data collection instruments?

20 Data Quality Assessment Tool
For assessment and capacity building. Another way to assess data quality is through a data quality assessment (DQA).

21 Purpose of the DQA The Data-Quality Assessment (DQA) Protocol is designed: to verify that appropriate data management systems are in place in countries; to verify the quality of reported data for key indicators at selected sites; and to contribute to M&E systems strengthening and capacity building.

22 DQA Components
Determine the scope of the data quality assessment: suggested criteria for selecting the program/project(s) and indicators.
Engage the program/project(s) and obtain authorization for the DQA: templates for notifying the program/project of the assessment; guidelines for obtaining authorization to conduct the assessment.
Assess the design and implementation of the program/project's data collection and reporting systems: steps and protocols to identify potential threats to data quality created by the program/project's data management and reporting system.

23 DQA Components
Trace and verify (recount) selected indicator results: a protocol with special instructions based on the indicator and the type of service delivery site (e.g., health facility or community-based organization).
Develop and present the assessment team's findings and recommendations: instructions on how and when to present the DQA findings; recommendations to program/project officials on how to plan follow-up activities to ensure that strengthening measures are implemented.

24 Example: Indicator Selection
Disease: HIV/AIDS. Indicator: Number of patients on ARV. Reporting period: 3-month period [1-Nov-05 / 31-Jan-06], national numbers.
Disease: TB. Indicator: Number of smear-positive TB cases registered under DOTS who are successfully treated. Reporting period: 3-month period [1-Oct-04 / 31-Dec-04].
Disease: Malaria. Indicator: Number of insecticide-treated bed nets (ITNs) distributed (i.e., number of vouchers redeemed). Reporting period: 6-month period [1-Nov-2005 / 30-Apr-2006], numbers reported to the Global Fund.

25 Chronology and Steps of the DQA
The DQA is implemented chronologically in six phases: (1) preparation and initiation (multiple locations); (2) the M&E Management Unit; (3) service delivery sites/organizations; (4) intermediate aggregation levels (e.g., district, region); (5) the M&E Management Unit; and (6) completion (multiple locations). The main activities across these phases are: select indicators and reporting period; obtain national authorizations and notify the program; select/confirm the service delivery points to be visited; assess data management and reporting systems; trace and verify reported results; draft initial findings and conduct a close-out meeting; draft and discuss the assessment report; and initiate follow-up of recommended actions. Assessments and verifications take place at every stage of the reporting system: the M&E Management Unit, intermediate aggregation levels (districts, regions), and service delivery sites.

26 DQA Outputs
Completed protocols and templates (part of the DQA Tool).
Write-ups of observations, interviews, and conversations with key data quality officials at the M&E Unit, intermediary reporting locations, and service delivery sites.
Preliminary findings and draft recommendation notes, based on the evidence collected in the protocols.
A final assessment report that summarizes the evidence collected, identifies specific assessment findings and the gaps related to that evidence, and includes recommendations to improve data quality directly linked to the findings.
Summary statistics calculated from the systems and data verification protocols.

27 DQA Outputs
Strength of the M&E system: an evaluation based on the review of the data management and reporting system, including summary responses on system design and implementation.
Verification factors: generated from the trace-and-verify recounting exercise performed on primary records and aggregated reports; the percentage comparison of reported numbers to verified numbers.
Percentages of available, timely, and complete reports, calculated at the intermediate aggregation levels and the M&E Unit.
Summary statistics developed from the systems and data verification protocols.
All follow-up communication with the program/project related to the results and recommendations of the DQA.

28 PROTOCOL 1: Assessment of Data Management and Reporting Systems
Applied at the M&E Management Unit (Phase 2), service delivery sites/organizations (Phase 3), and intermediate aggregation levels, e.g., district or region (Phase 4). Purpose: identify potential risks to data quality created by the data management and reporting systems at the M&E Management Unit, the service delivery points, and the intermediary aggregation levels (district or region). The DQA assesses both the design and the implementation of data management and reporting systems. The assessment covers eight functional areas (human resources, training, data management processes, etc.).

29 Functional Areas of M&E System that affect Data Quality
SYSTEMS ASSESSMENT QUESTIONS BY FUNCTIONAL AREA
I. M&E Capabilities, Roles and Responsibilities
1. Are key M&E and data-management staff identified with clearly assigned responsibilities?
II. Training
2. Have the majority of key M&E and data-management staff received the required training?
III. Data Reporting Requirements
3. Has the program/project clearly documented (in writing) what is reported to whom, and how and when reporting is required?
IV. Indicator Definitions
4. Are there operational indicator definitions meeting relevant standards, and are they systematically followed by all service points?
V. Data-Collection and Reporting Forms and Tools
5. Are there standard data-collection and reporting forms that are systematically used?
6. Are source documents kept and made available in accordance with a written policy?

30 Functional Areas of an M&E System that Affect Data Quality
VI. Data Management Processes
7. Does clear documentation of collection, aggregation, and manipulation steps exist?
VII. Data Quality Mechanisms and Controls
8. Are data quality challenges identified, and are mechanisms in place for addressing them?
9. Are there clearly defined and followed procedures to identify and reconcile discrepancies in reports?
10. Are there clearly defined and followed procedures to periodically verify source data?
VIII. Links with National Reporting System
11. Does the data collection and reporting system of the program/project link to the national reporting system?

31 PROTOCOL 2: Trace and Verify Indicator Data
Applied at the M&E Management Unit (Phase 2), service delivery sites/organizations (Phase 3), and intermediate aggregation levels, e.g., district or region (Phase 4). Purpose: assess, on a limited scale, whether service delivery points and intermediate aggregation sites are collecting and reporting data accurately and on time. The trace and verification exercise has two stages: (1) in-depth verifications at the service delivery points; and (2) follow-up verifications at the intermediate aggregation levels (districts, regions) and at the M&E Unit.

32 DQA Protocol 2: Trace and Verification
This chart shows visually how the trace-and-verify protocol starts with data at the level of service delivery (either a health facility or a community-based program), traces them to the intermediate aggregation level (in this example, districts), and then to the M&E Unit level (in this example, the national level). In the example, the service delivery site monthly reports (ARV numbers of 45, 20, 45, 75, 50, and 200, each backed by source documents) roll up into district monthly reports (District 1: 65; District 2: 45; District 3: 75; District 4: 250), which in turn roll up into the national monthly report total of 435.
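A minimal Python sketch of the re-aggregation idea illustrated above, using the same example figures: recount site results, re-aggregate them upward, and compare them to what each level reported. It is only an illustration, not the DQA Tool itself.

```python
# Verified (recounted) counts from source documents at each service delivery site.
site_reports = {
    "District 1": {"SDS 1": 45, "SDS 2": 20},
    "District 2": {"SDS 3": 45},
    "District 3": {"SDS 4": 75},
    "District 4": {"SDS 5": 50, "SDS 6": 200},
}
# Totals as reported by each district and by the national M&E Unit.
district_reported = {"District 1": 65, "District 2": 45,
                     "District 3": 75, "District 4": 250}
national_reported = 435

for district, sites in site_reports.items():
    recounted = sum(sites.values())
    print(district, "recounted:", recounted, "reported:", district_reported[district])

national_recounted = sum(sum(sites.values()) for sites in site_reports.values())
print("National recounted:", national_recounted, "reported:", national_reported)
```

Any gap between a recounted total and the reported total at a given level is then investigated and explained.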

33 Service Delivery Points – Data Verification
SERVICE DELIVERY POINT: FIVE TYPES OF DATA VERIFICATIONS
Verification no. 1: Description. Describe the connection between the delivery of services/commodities and the completion of the source document that records that service delivery. (In all cases.)
Verification no. 2: Documentation review. Review the availability and completeness of all indicator source documents for the selected reporting period.
Verification no. 3: Trace and verification. Trace and verify reported numbers: (1) recount the reported numbers from available source documents; (2) compare the verified numbers to the site-reported numbers; (3) identify reasons for any differences.
Verification no. 4: Cross-checks. Perform cross-checks of the verified report totals with other data sources (e.g., inventory records, laboratory reports). (If feasible.)
Verification no. 5: Spot checks. Perform spot checks to verify the actual delivery of services or commodities to the target populations.
At the service delivery level, these five verifications are done as part of the trace-and-verify protocol.

34 DQA Summary Statistics
These charts show the summary statistics that are automatically generated from the trace-and-verify protocol of the Data Quality Assessment Tool. For the reporting period and indicator assessed, they show: the verification factor (how the recounted data compare to the reported data), the availability of reports, the timeliness of reports, and the completeness of reports.

35 Illustration 1 - Trace and Verification at the M&E Unit (HIV/AIDS)
Number of patients on ARV, as of 31 August 2006 (findings from the country where the tool was pilot-tested). Verification factor (% difference between the reported and re-aggregated numbers): 44.7% recounted (21,449) versus 55.3% unaccounted for (26,654). Availability, completeness, and timeliness of reports: 33% of reports available, 67% missing; 69% complete, 41% incomplete; no tracking of timeliness. To count as complete, a report has to include (1) the name of the site, (2) the reporting period, (3) the name of the submitting person, and (4) cumulative data.

36 Illustration 3 – Systems’ Finding at the M&E Unit (HIV/AIDS)
Reporting level: National M&E Unit.
Finding: No specific documentation specifying data-management roles and responsibilities, reporting timelines, standard forms, storage policy, etc. Recommendation: Develop a data management manual to be distributed to all reporting levels.
Finding: Inability to verify reported numbers at the M&E Unit because too many reports from service points are missing (67%). Recommendations: Systematically file all reports from service points; develop guidelines on how to address missing or incomplete reports.
Finding: Most reports received by the M&E Unit are not signed off by any staff or manager from the service point. Recommendation: Reinforce the need for documented review of submitted data, for example by not accepting un-reviewed reports.

37 Findings from DQAs
Data are not collected routinely (a "reporting flurry" at reporting time).
Weak documentation of what was reported (source documents cannot be located; no filing system for easy retrieval).
Issues around double counting.
Integrity: incentives for over-reporting.
The effect of staff turnover.
Involving staff in M&E: definitions of indicators, the value of data, data use.
These are some common themes from data quality assessments.

38 MEASURE Evaluation is a MEASURE project funded by the
U.S. Agency for International Development and implemented by the Carolina Population Center at the University of North Carolina at Chapel Hill in partnership with Futures Group International, ICF Macro, John Snow, Inc., Management Sciences for Health, and Tulane University. Views expressed in this presentation do not necessarily reflect the views of USAID or the U.S. Government. MEASURE Evaluation is the USAID Global Health Bureau's primary vehicle for supporting improvements in monitoring and evaluation in population, health and nutrition worldwide.

