
MEASURE Evaluation Data Quality Assurance Workshop Session 3 Introduction to Routine Data Quality Assessment

Topics to Cover
 RDQA vs. DQA tools
 Data quality responsibilities by level
 Introduction to the RDQA
   Objectives
   Methodology
   Core components
   Timeline
   Preparation for conducting an RDQA
 After the RDQA

DQA and RDQA
The DQA and RDQA were developed for verifying the quality of reported data and assessing data management and reporting systems.

              Data Quality Audit (DQA)             Routine Data Quality Assessment (RDQA)
Function      Evaluation                           Monitoring
By whom       External audit team                  Programme/project team
Distinction   Highly structured, external audit    Simplified version of the DQA for use by programmes

Data Quality Tools
 Routine Data Quality Assessment tools:
   The Single Indicator tool -- used to assess one indicator only
   The Multi-Indicator tool -- used when 1-4 indicators from the same data flow need to be assessed
   The Longitudinal tool -- used to assess one indicator over four reporting periods
 The PRISM (Performance of Routine Information System Management) Framework is another tool used for routinely assessing the strength of health information systems. One major difference is that PRISM includes a behavioral assessment component.

Objectives of the RDQA
Verify
  Rapidly verify the quality of reported data
  Assess the capacity of information systems to collect and report quality data
Implement measures to
  Strengthen data management and reporting systems
  Improve data quality
Monitor
  Performance of data management and reporting systems
  Capacity to produce quality data

RDQA Methodology
 Data verification: documentation reviews
   Accuracy: re-counted vs. reported results
   Reliability: cross-checks
 Reporting performance
   Timeliness, completeness of reporting, and availability of reports
 System assessment
   Are the elements in place to ensure quality reporting?
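To make the quantitative side of this methodology concrete, the sketch below (an illustration only, not the RDQA workbook itself) computes a verification factor (re-counted vs. reported totals) and simple availability and timeliness rates from a set of invented monthly reports; the site names and values are hypothetical.

# Minimal sketch, assuming hypothetical monthly report records for one indicator.
from dataclasses import dataclass

@dataclass
class MonthlyReport:
    site: str
    reported: int          # value on the report submitted upward
    recounted: int         # value re-counted from source documents
    submitted_on_time: bool
    report_available: bool

reports = [
    MonthlyReport("Site A", reported=120, recounted=115, submitted_on_time=True,  report_available=True),
    MonthlyReport("Site B", reported=80,  recounted=88,  submitted_on_time=False, report_available=True),
    MonthlyReport("Site C", reported=45,  recounted=0,   submitted_on_time=False, report_available=False),
]

# Accuracy: verification factor = re-counted / reported (1.0 means perfect agreement)
total_reported = sum(r.reported for r in reports)
total_recounted = sum(r.recounted for r in reports if r.report_available)
verification_factor = total_recounted / total_reported if total_reported else float("nan")

# Reporting performance: share of expected reports that were available / on time
availability = sum(r.report_available for r in reports) / len(reports)
timeliness = sum(r.submitted_on_time for r in reports) / len(reports)

print(f"Verification factor: {verification_factor:.2f}")
print(f"Report availability: {availability:.0%}, timeliness: {timeliness:.0%}")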

Data Quality Responsibilities by Level
Reporting levels: M&E Unit, Intermediate Aggregation Site, Service Delivery Point

M&E Unit
  Provide lower reporting levels with clear guidelines on data collection and reporting
  Disseminate national policies related to data quality
  Conduct routine supervisory visits to lower levels
  Provide an organogram of positions and data management responsibilities
  For RDQAs:
    Initiate RDQAs in conjunction with other national program units
    Follow up on late, incomplete, inaccurate, or missing reports
    Capture in electronic format all data quality checks not yet captured, including spot/cross-checks, validations, and updates to error logs

Intermediate Aggregation Site
  Follow appropriate procedures to compile service delivery site forms monthly and send the report to the national M&E unit
  Ensure the budget includes funds for data quality activities
  Follow up on late, incomplete, or missing information
  Conduct routine supervisory visits to service delivery sites
  For RDQAs:
    Initiate RDQAs for service delivery sites
    Follow up on data verification checks as part of supervisory visits
    Document how discrepancies have been resolved

Service Delivery Point
  Summarize patient data and check the data quality of patient registers
  Submit monthly summary reports to the health district
  Routinely analyze and use data to improve the quality of care
  For RDQAs: health sites do not initiate RDQAs

Steps in Conducting an Assessment
 Verify and validate performance information to ensure that data are of reasonable quality
 Review data collection and processing procedures to ensure consistent application
 Review program capacity and human resources
 When data quality issues are identified, take steps to address them: develop and implement a budgeted action plan for strengthening the system

Core Components of the RDQA
 Data verifications (quantitative): compare re-counted data to reported data
 System assessment (qualitative): assess the strengths and weaknesses of the functional areas of the M&E system

Data Verification
 Observe or describe
   The connection between the delivery of services/commodities and the completion of the source document that records that service delivery
 Review source documents
   Review the availability and completeness of all indicator source documents for the selected reporting period
 Verify reported data
 Cross-checks
   Perform "cross-checks" of the verified report totals against other data sources
 Spot checks
   Verify the actual delivery of services or commodities to the target populations
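The cross-check step amounts to comparing the verified report totals with an independent source and flagging large discrepancies for follow-up. The sketch below illustrates that comparison; the site names, values, comparison source, and the 10% tolerance are all assumptions chosen for the example, not prescribed by the RDQA.

# Minimal sketch of a cross-check against a second data source (e.g., a pharmacy
# register). Values and the 10% threshold are hypothetical.
verified_totals = {"Site A": 115, "Site B": 88, "Site C": 52}
other_source_totals = {"Site A": 117, "Site B": 70, "Site C": 51}

TOLERANCE = 0.10  # flag differences larger than 10% of the verified total

for site, verified in verified_totals.items():
    other = other_source_totals.get(site)
    if other is None:
        print(f"{site}: no comparison data available")
        continue
    diff = abs(verified - other) / verified
    status = "FLAG for follow-up" if diff > TOLERANCE else "ok"
    print(f"{site}: verified={verified}, cross-check={other}, difference={diff:.0%} -> {status}")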

System Assessment: Functional Areas
 M&E capabilities, roles, and responsibilities
 Training
 Data reporting requirements
 Indicator definitions
 Data collection and reporting forms and tools
 Data management processes
 Data quality mechanisms and controls
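In practice the system assessment is a checklist answered per functional area and summarized into scores. The sketch below shows one way such answers could be tallied; the 3-point scale (Yes=3, Partly=2, No=1) and the example responses are assumptions for illustration, since the actual RDQA checklist defines its own items and scoring.

# Minimal sketch, assuming a Yes/Partly/No scale mapped to 3/2/1.
SCALE = {"yes": 3, "partly": 2, "no": 1}

answers = {
    "M&E capabilities, roles, and responsibilities": ["yes", "partly", "yes"],
    "Training":                                      ["partly", "no"],
    "Data management processes":                     ["yes", "yes", "partly", "no"],
}

for area, responses in answers.items():
    score = sum(SCALE[a] for a in responses) / len(responses)
    print(f"{area}: average score {score:.1f} / 3")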

When to Assess
 Integrate data quality control mechanisms into standard operating procedures
 Integrate data quality checks into routine supervisory visits
 Conduct periodic formal assessments (full RDQA vs. data verification only)
 The timeline can vary, but this is the suggested approach

Preparation for Conducting an RDQA
 Determine the purpose for conducting an RDQA
 Determine the levels and sites to assess
 Identify the indicators, data sources, and reporting periods
 Prepare for the site visits
   Notify sites of the visit two weeks in advance

After the RDQA
 Review the output of the RDQA
 Develop a system-strengthening plan, including follow-up actions
 Plan to share the outcome with the levels and sites that participated in the RDQA

Questions & Answers

MEASURE Evaluation is funded by the U.S. Agency for International Development (USAID) and implemented by the Carolina Population Center at the University of North Carolina at Chapel Hill in partnership with Futures Group, ICF International, John Snow, Inc., Management Sciences for Health, and Tulane University. Views expressed in this presentation do not necessarily reflect the views of USAID or the U.S. government. MEASURE Evaluation is the USAID Global Health Bureau's primary vehicle for supporting improvements in monitoring and evaluation in population, health and nutrition worldwide.