Indicators, Data Sources and Data Quality for TB M&E


Indicators, Data Sources and Data Quality for TB M&E

This issue is often one of the most important factors influencing which indicators we use, so it is worth reviewing before the indicator-selection activity. Data sources differ in every context, country, and region, so as you go through this presentation, please think of the data sources and data-quality issues that your country relies on and struggles with, so that we can discuss them afterward.

Criteria of good indicators
- Valid: actually measures the phenomenon it is intended to measure
- Reliable: produces the same results when used more than once to measure precisely the same phenomenon
- Specific: measures only the phenomenon it is intended to measure
- Sensitive: reflects changes in the state of the phenomenon under study
- Operational: can be measured with developed and tested definitions and reference standards

These ideas are probably review and were discussed in "How to Use the Compendium." Indicators are signs, clues, and markers of how far we are along our path and how things are changing.

Qualitative vs. Quantitative
- Qualitative: answers questions about how well the program elements are being carried out.
- Quantitative: measures how much and how many.

Examples of quantitative M&E tools: sign-in logs, registration forms, registers, checklists, program activity forms, patient charts, structured questionnaires.

Examples of qualitative M&E tools: focus groups, direct observation.

Factors in Indicator Selection
- What national, district, and local levels need to know
- Availability of the data
- Availability of human and financial resources to manage the data
- Program needs
- Lender requirements

TB data-collection methods and sources
- Routinely collected data
- Process monitoring and evaluation
- Program evaluations/reviews
- Global TB reporting
- Special surveys

Once you have designed a framework and picked appropriate indicators, you need to develop a data-collection strategy. Of course, you should have been thinking about data availability while developing the framework and indicators, but now you can focus on collection and analysis. No single source can satisfy all data needs for M&E.

Routine Recording
- TB register
- TB treatment card
- Laboratory register
- Cough register

These data are collected at TB treatment facilities and microscopy units; this is the basic unit of recording. To ensure data quality, it is important that these cards are filled out correctly and that all units fill them out in the same way. This often requires:
- An easy-to-use, pre-tested card
- Training
- Frequent review of how nurses and lab technicians are filling the cards out (supervision and feedback)
- Clearly communicating why it is important that this information be filled out correctly and on time

Routine Reporting
- District TB register
- Quarterly report of new cases and relapses of TB
- Quarterly report on treatment results for pulmonary-TB patients registered 12-15 months earlier

Monthly or quarterly reporting forms are sent to the Basic Management Unit (BMU), where they are aggregated and then sent to the next higher level. Assuming the basic records are filled out correctly, we want to ensure that the "transfer" to the aggregated level is done correctly and consistently. Another issue is timeliness. Again, this requires supervision and feedback.
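The quarterly cohort reports above feed directly into outcome indicators. As a minimal sketch of how such an indicator is derived (the field names and data here are illustrative assumptions, not taken from any official NTP form), the treatment success rate is the share of a registered cohort whose outcome is "cured" or "treatment completed":

```python
# Hypothetical sketch: treatment success rate for a quarterly cohort.
# Outcome labels and record layout are illustrative assumptions.

def treatment_success_rate(cohort):
    """Cured + treatment completed, as a share of all registered cases."""
    registered = len(cohort)
    if registered == 0:
        return 0.0
    successes = sum(1 for case in cohort
                    if case["outcome"] in ("cured", "completed"))
    return successes / registered

# Example quarterly cohort (illustrative outcomes)
cohort = [
    {"id": 1, "outcome": "cured"},
    {"id": 2, "outcome": "completed"},
    {"id": 3, "outcome": "died"},
    {"id": 4, "outcome": "cured"},
]
print(f"Treatment success rate: {treatment_success_rate(cohort):.0%}")  # 75%
```

Because the denominator comes from the registration report and the numerator from the outcome report filed 12-15 months later, errors in either quarterly form propagate directly into this indicator.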

Process Monitoring and Evaluation
- Analysis of recording and reporting
- Supervision
- Records of trainings held, meetings held, events, etc.

Program Evaluation/Review
- Comprehensive review of the entire program
- Conducted every 2 to 5 years
- External and internal experts break into groups and cover a representative sample of the country
- Usually provides input for developing or revising the medium-term development plan

Global reporting covers estimated incidence, case notification, treatment outcomes, some budget information, and coverage. You can use the WHO figures to track your own country over time.

Special studies
- Prevalence surveys
- Population-based surveys
- Facility surveys
- Vital-registration surveys
- Tuberculin surveys
- Drug-resistance surveys

Read from the blurbs on special studies in the introductory part of the Compendium.

Example of a national-level data-collection system: a timeline (2000-2006) showing the routine information system and surveillance running continuously, with periodic prevalence surveys, drug-resistance surveys (DRS), facility surveys, and external monitoring visits layered on top.

If appropriate, all of these activities should be budgeted for in a GFATM proposal. A facility survey could be considered part of a TB program review.

Why is data quality important? The primary function of health information systems is to provide data that enhance decision-making in the provision of health services. By ensuring high-quality data, the health information system helps guarantee that decision-makers have access to unbiased and complete information. Your data source is only as good as the data it produces.

Although ensuring good data quality has been touched on in the previous slides, it is worth spending a few moments specifically on this issue. Data are not just inputs to an indicator that must be reported elsewhere; they are tools for decision-makers at national, district, and local levels to improve their programs.

Standards for good-quality data
- Validity: Do the data clearly and directly measure what they were intended to measure?
- Integrity: Are mechanisms in place to reduce the possibility that data are intentionally manipulated?
- Precision: Are the data at the appropriate level of detail?
- Reliability: Would you come to the same finding if the data collection and analysis were repeated?
- Timeliness: Are data available frequently enough to inform decisions?

Impediments to good data quality
- Inappropriate data-collection instruments and procedures
- Poor recording and reporting
- Errors in processing data (editing, coding, data entry, tabulating)

What can be done to improve and ensure data quality?
- Keep the design of the information system as simple as possible
- Involve users in the design of the system
- Standardize procedures and definitions
- Pre-test data-collection instruments to make sure they are useful and user-friendly
- Ensure that the data collected are useful to the data collector
- Provide regular supervision and feedback from supervisors
- Plan for effective checking procedures (such as cross-checking)
- Provide training (data-collection instruments, data processing, analysis, and evidence-based decision-making)

Data-quality assessments: an example at the district level
- Step 1: Interview the appropriate individuals to understand the data collection, analysis, and maintenance process.
- Step 2: Review reports to determine whether they are internally consistent.

Now that we have discussed the basic kinds of data sources and the importance of ensuring data quality, let's walk through addressing data quality step by step.

Data-quality assessments (cont'd)
- Step 3: Periodically sample and review data for completeness, accuracy, and consistency:
  - Indicator definitions are consistent with NTP guidelines
  - Data collection is consistent from year to year
  - Data are complete in coverage
  - The formula used to calculate the indicator (if any) is applied correctly
- Step 4: Compare central-office records with district records, or district records with facility records, for consistency and accuracy.
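The Step 3 and Step 4 checks lend themselves to simple automation. Below is a minimal sketch, under an assumed data layout (the facility names, indicator fields, and figures are invented for illustration), of two routine checks: completeness of facility reports and consistency between summed facility figures and the district register:

```python
# Sketch of Step 3-4 data-quality checks. Facility names, indicator
# fields, and values are illustrative assumptions, not real data.

facility_reports = {
    "Facility A": {"new_smear_positive": 12, "relapse": 2},
    "Facility B": {"new_smear_positive": 8, "relapse": None},  # missing entry
    "Facility C": {"new_smear_positive": 15, "relapse": 1},
}
district_register_total = {"new_smear_positive": 35, "relapse": 4}

# Completeness check: flag facility reports with missing fields
incomplete = [name for name, report in facility_reports.items()
              if any(value is None for value in report.values())]

# Consistency check: do summed facility figures match the district register?
discrepancies = {}
for indicator, district_value in district_register_total.items():
    facility_sum = sum(report[indicator] or 0
                       for report in facility_reports.values())
    if facility_sum != district_value:
        discrepancies[indicator] = (facility_sum, district_value)

print("Incomplete reports:", incomplete)
print("Discrepancies (facility sum vs. district):", discrepancies)
```

In this invented example, Facility B's missing relapse count is flagged for follow-up, and the relapse discrepancy (facility sum 3 vs. district total 4) would trigger the kind of cross-check described in Step 4. In practice such checks supplement, rather than replace, supervisory visits and source-document review.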

Data-quality assessments (cont'd): possible data-quality limitations
- Validity: The reported data do not accurately represent the population; for example, records may over-report or under-report certain parts of the population.
- Integrity: The data could be manipulated for a variety of reasons.
- Timeliness: If reporting is not up to date, decisions are not based on the most recent evidence.
- Reliability: Implementation of data collection may be irregular or mistimed.

Conclusion: to be developed based on the context/region and on what the biggest data-quality issue is.