Multiple Indicator Cluster Surveys Survey Design Workshop


Multiple Indicator Cluster Surveys Survey Design Workshop: MICS Evaluations and Lessons Learned from MICS4

Part 1: MICS Evaluations

MICS Evaluations
- MICS1: evaluated
- MICS2: no evaluation
- MICS3: evaluated by John Snow Inc.
  - Comparable quality with DHS and other survey programs
  - Fulfills an important mission in global monitoring
  - Mismatch between where the technical expertise sits (HQ) and where technical decisions are taken (country), plus communication problems
  - Short-cuts are being taken in training, listing, and fieldwork
  - Limited human resources are an impediment

MICS Evaluations
- MICS4 Evaluation: Cluster 1 and Cluster 2; Cluster 1 completed

MICS4 Evaluation - Findings
- Significant expansion of the envelope of technical support resources: regional coordinators, support by experts, UNICEF MICS Consultants, and a more structured process of technical support and quality assurance
- Organizational structure, communication channels, and decision-making authorities remain unchanged, which is suboptimal for the objectives (e.g. COs not complying with guidelines and quality assurance processes: large samples, additional questions)
- Not on the agenda of senior managers at HQ or RO levels

MICS4 Evaluation - Findings
- Universal adherence to training guidelines (duration)
- No evidence of interviews or spot-checks
- Field check tables are an important tool, but their use is inconsistent
- Sample sizes and survey teams larger than recommended, manageable levels
- Shorter time for production of final reports

MICS4 Evaluation - Findings
- Dramatic improvement in data quality
- MICS4 and DHS have comparable quality on most indicators
- Quality of some MICS data still needs improvement

MICS4 Evaluation - Recommendations
- COs to be compelled to hire UNICEF MICS Consultants (UMCs)
- Increase the regional consultant pool
- Fully integrate technical review and quality assurance processes into key documents
- When MICS reports are lagging, an additional implementing agency or consultant to finalize the report; include this in the MoU
- UNICEF should invest more in other data collection efforts for lower-administrative-level data generation, without hampering MICS or other household surveys

MICS4 Evaluation - Recommendations
- Additional data processing staff needed
- Strengthen the use of field check tables
- Increase guidance to ROs to gauge risks in advance of MoUs, and for course-correction and withdrawal from the global MICS programme
- Do's and don'ts for CO and RO managers
- Tools to be developed to ensure consistency of the work of regional consultants
- Documentation for sample design and implementation

MICS4 Evaluation - Recommendations
- Spot checks and observations
- Measures for further improvement of anthropometric data quality
- Better documentation of Global MICS Consultations
- Regional coordinator turn-over: overlaps needed

Part 2: Data Quality

Looking at data quality - Why?
- Confidence in survey results
- Identify limitations in results
- Inform dissemination and policy formulation
- All surveys are subject to errors

Data quality
Two types of errors in surveys:
- Sampling errors
- Non-sampling errors: all other types of errors, arising at any stage of the survey process other than the sample design
All survey stages are interconnected and play roles in non-sampling errors.

Data quality
- Sampling errors can be anticipated before data collection and measured after data collection
- Non-sampling errors are more difficult to control and/or identify
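To make the contrast concrete: once the design is fixed, sampling error has a closed form. A minimal sketch of the standard error of an estimated proportion under simple random sampling (the function name and figures are illustrative; a clustered design such as MICS inflates the result by a design effect):

```python
import math

def se_proportion(p, n):
    """Standard error of a sample proportion under simple random
    sampling. A clustered design inflates this by a design effect
    (deff), so multiply by sqrt(deff) in practice."""
    return math.sqrt(p * (1.0 - p) / n)

# Illustrative numbers: a 40% coverage estimate from 2,500 cases
p, n = 0.40, 2500
se = se_proportion(p, n)
low, high = p - 1.96 * se, p + 1.96 * se  # approximate 95% confidence interval
```

Non-sampling errors have no comparable formula, which is why the rest of this section relies on indirect checks.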

Data quality
- We have discussed several features/recommendations for quality assurance to minimize non-sampling errors
- Failure to comply with the principles behind these recommendations leads to problems in data quality

Data quality analyses
Looking at:
- Departures from recommended procedures/protocols
- Internal consistency
- Completeness

Before we begin

From monitoring priorities to standard survey instruments (countries, UNICEF, interagency groups, partners in development):
- Monitoring Priorities: goals and targets, indicators
- Operationalization: validation, testing, piloting, national surveys
- Standard Survey Instruments: questionnaires, data processing tools, sampling considerations, analysis plans, dissemination strategies

A major source of poor data quality is the operationalization step: translating goals, targets, and indicators through non-validated, untested survey instruments instead of the standard survey instruments.

Completion, Age, Completeness

Household Completion Rates = Completed households / Selected households

Household Response Rates

Women’s Response Rates

Under-5 Response Rates
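The completion and response rates in the slides above are simple ratios of counts from a survey's coverage tables. A minimal sketch with invented counts (the variable names and numbers are illustrative, not MICS data):

```python
def rate(numerator, denominator):
    """A coverage rate expressed as a percentage."""
    return 100.0 * numerator / denominator

# Invented counts, not from any actual survey
selected, occupied, completed = 6000, 5900, 5841
household_completion_rate = rate(completed, selected)  # completed / selected
household_response_rate = rate(completed, occupied)    # completed / occupied

# Women's and under-5 response rates follow the same pattern;
# the overall rate also multiplies in the household response rate.
eligible_women, interviewed_women = 7200, 6984
womens_response_rate = rate(interviewed_women, eligible_women)
overall_womens_rate = household_response_rate * womens_response_rate / 100.0
```

Low rates in any of these tables are an early warning to investigate before looking at substantive indicators.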

Age Distributions
[series of chart slides showing single-year age distributions from MICS and other surveys]
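Charts like these are typically read for heaping on ages ending in 0 and 5. One conventional summary statistic for this (not named in the slides, so this is an illustrative addition) is Whipple's index:

```python
def whipples_index(age_counts):
    """Whipple's index of age heaping on terminal digits 0 and 5,
    computed over the conventional age range 23-62.
    100 = no heaping; 500 = every reported age ends in 0 or 5.

    age_counts: mapping of single-year age -> population count.
    """
    in_range = {a: c for a, c in age_counts.items() if 23 <= a <= 62}
    total = sum(in_range.values())
    heaped = sum(c for a, c in in_range.items() if a % 5 == 0)
    return 500.0 * heaped / total

# A perfectly uniform age distribution scores 100 (no heaping)
uniform = {a: 100 for a in range(23, 63)}
index = whipples_index(uniform)
```

Values well above 100 in a field check table would prompt retraining interviewers on probing for exact ages.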

Women – Complete Birth Date

Under-5s – Complete Birth Date

MICS Protocols: Observations and Selection

Selection for Child Discipline

Refusals to Observe Place for Handwashing

Observing Documents

Serious Business: Out-transference and Omission

Years Since Last Birth
[series of chart slides showing distributions of years since women's last birth]

Ratio of Children Age 2 to Age 1

Ratio of Population Age 2-3 to Age 0-1

Ratio of Population Age 5 to 4
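The age ratios in the last three slides are simple flags for displacement around eligibility boundaries: in a roughly stable population, adjacent single-year counts should be close to each other, so a ratio of age 5 to age 4 well above 1 suggests children were shifted out of the under-5 questionnaire's range. A minimal sketch (counts are invented for illustration):

```python
def age_ratio(counts, numerator_ages, denominator_ages):
    """Ratio of population counts in two sets of single-year ages.
    Values far from 1.0 for adjacent ages can flag age displacement."""
    num = sum(counts.get(a, 0) for a in numerator_ages)
    den = sum(counts.get(a, 0) for a in denominator_ages)
    return num / den

# Invented counts with children apparently shifted from age 4 to age 5
counts = {0: 1000, 1: 990, 2: 985, 3: 980, 4: 850, 5: 1110, 6: 960}
ratio_5_to_4 = age_ratio(counts, [5], [4])          # well above 1: suspicious
ratio_23_to_01 = age_ratio(counts, [2, 3], [0, 1])  # close to 1: unremarkable
```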

Out-transference from age 15
[series of chart slides from MICS and non-MICS surveys]

Out-Transference/Omission

An Easy Target: Anthropometry

Anthropometry: Digit Heaping

Digit Recorded for Weight
Digit Recorded for Height
[series of chart slides showing distributions of the terminal digit recorded for weight and height measurements]
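Digit-heaping charts like these plot how often each terminal digit appears in recorded weights and heights: with careful measurement each digit should appear roughly 10% of the time, while spikes at .0 and .5 suggest measurers are rounding. A minimal sketch of the computation (the measurements are invented):

```python
from collections import Counter

def terminal_digit_shares(measurements):
    """Percentage share of each terminal decimal digit (0-9) in
    measurements recorded to one decimal place (e.g. height in cm,
    weight in kg). Unbiased recording gives roughly 10% per digit."""
    digits = [round(m * 10) % 10 for m in measurements]
    counts = Counter(digits)
    n = len(digits)
    return {d: 100.0 * counts.get(d, 0) / n for d in range(10)}

# Invented heights heavily heaped on .0 and .5
heights = [84.0, 85.5, 86.0, 87.3, 88.0, 90.5, 91.0, 92.5, 93.0, 95.5]
shares = terminal_digit_shares(heights)
```

Here shares[0] and shares[5] dominate, which in a real field check table would trigger re-standardization of the measurers.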

MENA Surveys

Operations

Summary
- Comply with the principles behind standard protocols
- Think of the implications of each action on other stages of implementation, and on data quality
- Always check for errors/issues
- "Understand" your data
- Be transparent and report on problems... before others detect them