Data Quality Considerations

Presentation transcript:

Data Quality Considerations
M&E Capacity Strengthening Workshop, Maputo, 19 and 20 September 2011
Arif Rashid, TOPS

Data Quality?
Project implementation: project activities are implemented in the field and are designed to produce quantifiable results.
Data management system (DMS): an information system represents these activities by collecting the results that were produced and mapping them to a recording system.
Data quality: how well the DMS represents the facts, i.e., how true a picture it gives of the field.
Slide # 1

Why Data Quality?
The program is "evidence-based".
Data quality drives data use.
Accountability.
Slide # 2

Conceptual Framework of Data Quality
Dimensions of data quality: accuracy, completeness, reliability, timeliness, confidentiality, precision, integrity.
Quality data depend on the data management and reporting system, which carries results from service delivery points, through intermediate aggregation levels (e.g. districts, regions), up to the M&E unit in the country office.
Functional components of data management systems needed to ensure data quality:
M&E structures, roles and responsibilities
Indicator definitions and reporting guidelines
Data collection and reporting forms/tools
Data management processes
Data quality mechanisms
M&E capacity and system feedback
Slide # 3

Dimensions of data quality
Accuracy/Validity: accurate data are considered correct. Accurate data minimize error (e.g. recording or interviewer bias, transcription error, sampling error) to the point of being negligible.
Reliability: data generated by a project's information system are based on protocols and procedures, and are objectively verifiable. The data are reliable because they are measured and collected consistently.
Slide # 4
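The slides treat accuracy conceptually, but simple automated checks can support it in practice. The following is a minimal, illustrative Python sketch that is not part of the original presentation; the field names 'site', 'month' and 'people_trained' and the plausibility threshold are assumptions.

    def find_suspect_records(records, max_plausible=500):
        """Flag reports that look like data-entry errors.
        Each record is assumed to be a dict with the hypothetical fields
        'site', 'month' and 'people_trained'."""
        suspects = []
        seen = set()
        for rec in records:
            key = (rec["site"], rec["month"])
            if key in seen:
                # Duplicate report for the same site and month.
                suspects.append((rec, "duplicate report"))
            seen.add(key)
            if not 0 <= rec["people_trained"] <= max_plausible:
                # Negative or implausibly large counts suggest transcription error.
                suspects.append((rec, "value outside plausible range"))
        return suspects

Flagged records would still need to be traced back to source documents before any correction is made.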

Dimensions of data quality
Precision: the data have sufficient detail. For example, an indicator may require the number of individuals who received training on integrated pest management, disaggregated by sex. An information system lacks precision if it is not designed to record the sex of each individual who received training.
Completeness: an information system from which results are derived is appropriately inclusive: it represents the complete list of eligible persons or units, not just a fraction of the list.
Slide # 5
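Continuing the sex-disaggregation example above, a hypothetical check for these two dimensions might look like the sketch below. The field and variable names are assumptions for illustration, not part of the original material.

    def check_precision_and_completeness(records, expected_sites):
        """Return training records missing sex disaggregation (precision)
        and expected sites that have not reported at all (completeness)."""
        missing_sex = [r for r in records if r.get("sex") not in ("female", "male")]
        reported_sites = {r["site"] for r in records}
        unreported_sites = sorted(set(expected_sites) - reported_sites)
        return missing_sex, unreported_sites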

Dimensions of data quality
Timeliness: data are timely when they are up to date (current) and when the information is available on time.
Integrity: data have integrity when the systems used to generate them are protected from deliberate bias or manipulation for political or personal reasons.
Slide # 6
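A timeliness check can be as simple as comparing submission dates against the reporting deadline. The sketch below is illustrative only; the field names and the example deadline are assumptions.

    from datetime import date

    def late_reports(reports, deadline):
        """List reports submitted after the reporting deadline.
        Each report is assumed to be a dict with 'site' and
        'submitted_on' (a datetime.date) fields."""
        return [r for r in reports if r["submitted_on"] > deadline]

    # Illustrative use: flag reports received after 5 October 2011.
    # late = late_reports(reports, date(2011, 10, 5))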

Dimensions of data quality
Confidentiality: respondents are assured that their data will be maintained according to national and/or international standards. This means that personal data are not disclosed inappropriately, and that data in hard copy and electronic form are treated with appropriate levels of security (e.g. kept in locked cabinets and in password-protected files).
Slide # 7
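One technical measure that can complement the physical and access controls mentioned above is pseudonymizing person-level records before they are shared. This is an illustrative sketch only; the 'name' field and the salt value are assumptions, and on its own it does not satisfy any particular national or international standard.

    import hashlib

    def pseudonymize(record, salt="replace-with-a-project-secret"):
        """Return a copy of a respondent record with the name replaced by
        a salted hash, so shared data cannot be traced to a person by name."""
        cleaned = dict(record)
        name = cleaned.pop("name")
        cleaned["respondent_id"] = hashlib.sha256(
            (salt + name).encode("utf-8")
        ).hexdigest()[:12]
        return cleaned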

Data Quality Assessments
Project participants, managers, technicians, field staff, local government partners, headquarters.
Slide # 8

Data Quality Assessments
Two dimensions of assessment:
1. Assessment of data management and reporting systems
2. Follow-up verification of reported data for key indicators (spot checks of actual figures)
Slide # 9
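For the second dimension, a spot check typically compares the figure reported for an indicator against a recount from source documents. The sketch below shows one common convention for that comparison; it is illustrative and not taken from the presentation.

    def verification_factor(reported, recounted):
        """Ratio of the recounted value to the reported value for one indicator.
        Values close to 1.0 suggest accurate reporting; values well below 1.0
        suggest over-reporting, and values well above 1.0 suggest under-reporting."""
        if reported == 0:
            return None  # ratio undefined; investigate the zero report separately
        return recounted / reported

    # Illustrative use: verification_factor(reported=120, recounted=108) -> 0.9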

Systems assessment tools
M&E structures, functions and capabilities
1. Are key M&E and data-management staff identified, with clearly assigned responsibilities?
2. Have the majority of key M&E and data-management staff received the required training?
Indicator definitions and reporting guidelines
3. Are there operational indicator definitions, meeting relevant standards, that are systematically followed by all service points?
4. Has the project clearly documented what is reported to whom, and how and when reporting is required?
Data collection and reporting forms/tools
5. Are there standard data-collection and reporting forms that are systematically used?
6. Are data recorded with sufficient precision/detail to measure relevant indicators?
7. Are source documents kept and made available in accordance with a written policy?
Slide # 10

Systems assessment tools
Data management processes
Does clear documentation of collection, aggregation and manipulation steps exist?
Are data quality challenges identified, and are mechanisms in place for addressing them?
Are there clearly defined and followed procedures to identify and reconcile discrepancies in reports?
Are there clearly defined and followed procedures to periodically verify source data?
M&E capacity and system feedback
Do M&E staff have a clear understanding of their roles and of how data collection and analysis fit into overall program quality?
Do M&E staff have a clear understanding of the PMP, IPTT and M&E Plan?
Do M&E staff have the required skills in data collection, aggregation, analysis, interpretation and reporting?
Are there clearly defined feedback mechanisms to improve data and system quality?
Slide # 11
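The checklist questions on slides 10 and 11 are usually answered qualitatively, but teams sometimes convert the answers into rough scores per functional component to track improvement over time. The sketch below is one illustrative way to do that; the yes/partly/no scale and the weights are assumptions, not part of the original tool.

    SCORES = {"yes": 1.0, "partly": 0.5, "no": 0.0}

    def component_score(answers):
        """Average a list of 'yes'/'partly'/'no' answers for one functional
        component of the systems assessment (e.g. data management processes)."""
        return sum(SCORES[a] for a in answers) / len(answers)

    # Illustrative use: component_score(["yes", "partly", "no", "yes"]) -> 0.625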

Schematic of follow-up verification Slide # 12

M&E system design for data quality
Appropriate design of the M&E system is necessary to comply with both aspects of the DQA:
Ensure that all dimensions of data quality are incorporated into the M&E design.
Ensure that all processes and data-management operations are implemented and fully documented (a comprehensive paper trail facilitates follow-up verification).
Slide # 13

This presentation was made possible by the generous support of the American people through the United States Agency for International Development (USAID). The contents are the responsibility of Save the Children and do not necessarily reflect the views of USAID or the United States Government.