Multiple Indicator Cluster Surveys
MICS Survey Design Workshop
MICS Evaluations and Lessons Learned from MICS4
Part 1: MICS Evaluations
MICS Evaluations
- MICS1: evaluated
- MICS2: no evaluation
- MICS3: evaluated by John Snow Inc.
  - Comparable quality with DHS and other survey programs
  - Fulfills an important mission in global monitoring
  - Mismatch between where the technical expertise sits (HQ) and where technical decisions are taken (country), compounded by communication problems
  - Short-cuts taken in training, listing, and fieldwork
  - Limited human resources an impediment
MICS Evaluations
MICS4 Evaluation: Cluster 1 and Cluster 2; Cluster 1 completed
MICS4 Evaluation - Findings
- Significant expansion of the envelope of technical support resources: regional coordinators, support by experts, UNICEF MICS Consultants (UMCs), and a more structured process of technical support and quality assurance
- Organizational structure, communication channels, and decision-making authorities remain unchanged, which is suboptimal for the objectives (e.g. COs not complying with guidelines and quality assurance processes: large samples, additional questions)
- Not on the agenda of senior managers at HQ or RO levels
MICS4 Evaluation - Findings
- Universal adherence to training guidelines (duration)
- No evidence of interviews or spot-checks
- Field check tables an important tool, but used inconsistently
- Sample sizes and survey teams larger than recommended, though at manageable levels
- Shorter time to produce final reports
MICS4 Evaluation - Findings
- Dramatic improvement in data quality
- MICS4 and DHS have comparable quality on most indicators
- Quality of some MICS data still needs improvement
MICS4 Evaluation - Recommendations
- Country Offices (COs) to be compelled to hire UMCs
- Increase the regional consultant pool
- Fully integrate technical review and quality assurance processes into key documents
- When MICS reports lag, bring in an additional implementing agency or consultant to finalize the report; include this in the MoU
- UNICEF should invest more in other data collection efforts for lower-administrative-level data generation, without hampering MICS or other household surveys
MICS4 Evaluation - Recommendations
- Additional data processing staff needed
- Strengthen the use of field check tables
- Increase guidance to ROs to gauge risks in advance of MoUs, and for course-correction and withdrawal from the global MICS program
- Dos and don'ts for CO and RO managers
- Tools to be developed to ensure consistency in the work of regional consultants
- Documentation for sample design and implementation
MICS4 Evaluation - Recommendations
- Spot checks and observations
- Measures for further improvement of anthropometric data quality
- Better documentation of Global MICS Consultations
- Regional coordinator turnover: overlaps needed
Part 2: Data Quality
Looking at data quality - Why?
- Confidence in survey results
- Identify limitations in results
- Inform dissemination and policy formulation
- All surveys are subject to errors
Data quality
Two types of errors in surveys:
- Sampling errors
- Non-sampling errors: all other types of errors, arising at any stage of the survey process other than the sample design
All survey stages are interconnected and play roles in non-sampling errors
Data quality
- Sampling errors can be anticipated before data collection and measured after it
- Non-sampling errors are more difficult to control and/or identify
Data quality
- We have discussed several features/recommendations for quality assurance to minimize non-sampling errors
- Failure to comply with the principles behind these recommendations leads to problems in data quality
Data quality analyses
Looking at:
- Departures from recommended procedures/protocols
- Internal consistency
- Completeness
Before we begin
From Monitoring Priorities to Standard Survey Instruments
- Monitoring priorities: set by countries, UNICEF, interagency groups, and partners in development
- Goals and targets → indicators
- Operationalization: validation, testing, piloting, national surveys
- Standard survey instruments: questionnaires, data processing tools, sampling considerations, analysis plans, dissemination strategies
From Monitoring Priorities to Standard Survey Instruments
- A major source of poor data quality: non-validated, untested survey instruments
- The risk sits at the operationalization step, between indicators (goals and targets) and standard survey instruments
Completion, Age, Completeness
Household Completion Rates: Completed / Selected
Household Response Rates
Women’s Response Rates
Under-5 Response Rates
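The completion and response rates in these slides can be tabulated directly from the result-of-interview codes in the data files. A minimal sketch of both calculations, assuming an illustrative list of string result codes rather than the actual MICS variable codes:

```python
# Data-quality check: completion and response rates from interview results.
# The string result codes below are illustrative stand-ins for the
# result-of-interview variables in real MICS data files.

def completion_rate(results):
    """Completed interviews / all selected units."""
    completed = sum(1 for r in results if r == "completed")
    return completed / len(results)

def response_rate(results):
    """Completed / (selected minus units that were never eligible)."""
    eligible = [r for r in results if r not in ("not_found", "ineligible")]
    completed = sum(1 for r in eligible if r == "completed")
    return completed / len(eligible)

household_results = ["completed"] * 92 + ["refused"] * 3 + ["not_found"] * 5

print(f"completion rate: {completion_rate(household_results):.2f}")  # 0.92
print(f"response rate:   {response_rate(household_results):.2f}")    # 0.97
```

The same functions apply to women's and under-5 interview results; the denominator definitions (who counts as eligible) should follow the survey's own tabulation plan.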
Age Distributions
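One standard way to quantify the preference for ages ending in 0 or 5 that these age distributions often reveal is Whipple's index, a general demographic measure (not MICS-specific) computed over ages 23-62. A minimal sketch:

```python
# Whipple's index: measures heaping on ages ending in 0 or 5.
# Computed over ages 23-62; 100 = no heaping, 500 = every reported
# age ends in 0 or 5.

def whipple_index(age_counts):
    """age_counts maps single-year age -> number of persons."""
    total = sum(n for age, n in age_counts.items() if 23 <= age <= 62)
    heaped = sum(n for age, n in age_counts.items()
                 if 23 <= age <= 62 and age % 5 == 0)
    return 500 * heaped / total

# A perfectly even distribution gives an index of exactly 100.
flat = {age: 10 for age in range(23, 63)}
print(whipple_index(flat))  # 100.0
```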
Women – Complete Birth Date
Under-5s – Complete Birth Date
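Birth-date completeness, as charted above, is simply the share of respondents with both month and year of birth reported. A sketch, using `None` as an illustrative stand-in for the survey's "don't know"/missing codes:

```python
# Share of records with a complete (month and year) birth date.
# None stands in for the survey's "don't know"/missing codes.

def complete_birthdate_share(records):
    """records is a list of (month, year) tuples."""
    complete = sum(1 for month, year in records
                   if month is not None and year is not None)
    return complete / len(records)

women = [(6, 1988), (None, 1990), (11, 1979), (2, None), (7, 1995)]
print(complete_birthdate_share(women))  # 0.6
```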
Observations and Selection - MICS Protocols
Selection for Child Discipline
Refusals to Observe Place for Handwashing
Observing Documents
Out-transference and Omission - Serious Business
Years Since Last Birth
Ratio of children age 2 to age 1
Ratio of Population Age 2-3 to Age 0-1
Ratio of Population Age 5 to Age 4
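Ratios like these flag displacement of children across the under-5 eligibility boundary, where age determines whether the lengthy under-5 questionnaire applies. A minimal sketch of the checks, with made-up counts:

```python
# Age-ratio checks for displacement of children across eligibility
# boundaries. A ratio far from 1 at the boundary (e.g. age 5 vs age 4)
# suggests ages were shifted to avoid the under-5 questionnaire.

def age_ratio(age_counts, numerator_ages, denominator_ages):
    """Ratio of listed persons in one age group to another."""
    num = sum(age_counts.get(a, 0) for a in numerator_ages)
    den = sum(age_counts.get(a, 0) for a in denominator_ages)
    return num / den

# Illustrative single-year counts of children in a household listing.
counts = {0: 480, 1: 470, 2: 465, 3: 460, 4: 380, 5: 540, 6: 450}

print(age_ratio(counts, [2], [1]))        # ratio of age 2 to age 1
print(age_ratio(counts, [5], [4]))        # > 1 hints at 4 -> 5 displacement
print(age_ratio(counts, [2, 3], [0, 1]))  # population age 2-3 vs age 0-1
```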
Out-transference from age 15
Out-transference from age 15 (non-MICS)
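The same kind of check applies at the women's eligibility boundary: if interviewers push 15-year-olds down to age 14 to avoid the individual interview, the household listing shows a surplus at 14 and a deficit at 15. A sketch with illustrative counts:

```python
# Check for out-transference at the women's eligibility boundary (15-49):
# compare the count at the boundary age with its neighbours.

def boundary_deficit(age_counts, boundary=15):
    """Ratio of the count at the boundary age to the average of the two
    adjacent ages; values well below 1 suggest out-transference."""
    neighbours = (age_counts[boundary - 1] + age_counts[boundary + 1]) / 2
    return age_counts[boundary] / neighbours

# Made-up listing counts showing a surplus at 14 and a deficit at 15.
listing_counts = {13: 200, 14: 260, 15: 150, 16: 210, 17: 205}
print(f"{boundary_deficit(listing_counts):.2f}")  # well below 1.0
```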
Out-transference/Omission
Easy Target: Anthropometry
Anthropometry: Digit Heaping
Digit Recorded for Weight
Digit Recorded for Height
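The digit-preference charts above can be reproduced by tabulating the terminal digit of recorded weights and heights; with careful measurement each digit 0-9 should appear roughly 10% of the time, so heaping on 0 and 5 indicates rounding by measurers. A minimal sketch (the values are made up):

```python
from collections import Counter

# Terminal-digit distribution for anthropometric measurements recorded
# to one decimal place. Heaping on 0 and 5 indicates rounding.

def terminal_digit_shares(values, decimals=1):
    """Share of each final digit when values are recorded to `decimals`."""
    digits = [int(round(v * 10**decimals)) % 10 for v in values]
    counts = Counter(digits)
    return {d: counts.get(d, 0) / len(values) for d in range(10)}

weights_kg = [10.0, 12.5, 9.0, 11.5, 10.0, 13.2, 12.0, 9.5, 10.5, 11.0]
shares = terminal_digit_shares(weights_kg)
heaped = shares[0] + shares[5]
print(f"share ending in 0 or 5: {heaped:.0%}")  # 90%: heavy heaping
```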
MENA Surveys
Operations
Summary
- Comply with the principles behind standard protocols
- Think of the implications of each action on other stages of implementation, and on data quality
- Always check for errors/issues
- "Understand" your data
- Be transparent and report on problems… before others detect them