Multiple Indicator Cluster Surveys Survey Design Workshop

Fieldwork: Survey Quality Control

Objectives
- Identify factors affecting the accuracy and reliability of survey data
- How to prevent and correct errors
- The essential role of supervision in the field and providing feedback to the team

Accuracy and reliability
- The accuracy (or validity) of a measurement is concerned with the net difference between the mean of the measurements obtained and the true value (related to the size of the bias)
- The reliability (or precision) of a measurement refers to the degree to which repeated measurements give consistent values (related to the size of the confidence interval)
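The short Python sketch below is purely illustrative and not part of the MICS materials: it simulates repeated measurements with a deliberate systematic error and reports the estimated bias (accuracy) and the half-width of the 95% confidence interval (reliability). All values are hypothetical.

```python
# Illustrative only: separating bias (accuracy) from confidence-interval
# width (reliability) in a set of repeated measurements.
import numpy as np

rng = np.random.default_rng(42)
true_value = 100.0                                          # hypothetical true value
measurements = rng.normal(loc=103.0, scale=5.0, size=200)   # systematic error of +3, random spread of 5

bias = measurements.mean() - true_value                      # accuracy: how far the mean sits from the truth
std_error = measurements.std(ddof=1) / np.sqrt(len(measurements))
ci_half_width = 1.96 * std_error                             # reliability: half-width of the 95% CI

print(f"Estimated bias: {bias:.2f}")
print(f"95% CI half-width: {ci_half_width:.2f}")
```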

Accuracy and reliability
Four possible combinations:
- Unreliable but accurate: imprecise but unbiased
- Unreliable and inaccurate: imprecise and biased
- Reliable but inaccurate: precise but biased
- Reliable and accurate: precise and unbiased

Overview of presentation
- Data collection
- Field supervision

Data Collection
- Organisation of daily work
- Security of staff and equipment

Data Collection: Implementation of sample
- Non-response: failure to obtain information for selected households, eligible women, or children
- A potentially serious bias that can be minimized
- Interviewers will need to make return visits to households (call-backs)

Data Collection: Implementation of sample
Types of non-response:
- Interviewer is unable to locate or access the selected household
- Interviewer is unable to meet eligible respondents
- Respondent refuses to be interviewed

Data Collection: Implementation of sample
Types of non-response: Interviewer is unable to locate or access the selected household. Why?
- Structure not found
- Occupied structure inaccessible
- Structure non-residential, vacant, or demolished

Data Collection: Implementation of sample
Types of non-response: Interviewer is unable to meet eligible respondents. Why?
- No one at home at time of call
- Respondent temporarily absent
- Call-backs will be needed

Call-backs
- An interview that is not completed requires a "call-back", or follow-up visit
- Three call-backs are required (at different times and on different days)
- Supervisors and interviewers keep track using control sheets
- Requires good tracking of work to ensure that all planned interviews are completed before leaving the cluster
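As a minimal sketch of how call-back tracking could look in code (the record layout, result codes, and identifiers below are hypothetical and do not reproduce the official MICS control sheets), the following Python flags households that still need a return visit before the team leaves the cluster.

```python
# Hypothetical call-back log: flag households that still need a return visit.
from dataclasses import dataclass, field

MAX_CALLBACKS = 3  # up to three return visits, at different times and on different days

@dataclass
class HouseholdRecord:
    household_id: str
    visits: list = field(default_factory=list)  # one result code per visit (illustrative codes)

    def needs_callback(self) -> bool:
        resolved = "completed" in self.visits
        return not resolved and len(self.visits) < MAX_CALLBACKS

cluster = [
    HouseholdRecord("HH-001", ["completed"]),
    HouseholdRecord("HH-002", ["no one home"]),
    HouseholdRecord("HH-003", ["no one home", "respondent absent", "no one home"]),
]

pending = [hh.household_id for hh in cluster if hh.needs_callback()]
print("Households still requiring a call-back:", pending)
```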

Data Collection: Implementation of sample
Types of non-response: Respondent refuses to be interviewed. Why? What to do?
- Approach the respondent from her point of view
- Postpone the interview to another day
- Have the supervisor or field editor revisit the respondent
The number of refusals should be closely monitored, and the reasons for frequent refusals investigated.

Data Collection: Monitoring field work
- Training is a continuous process
- Observation and supervision throughout the fieldwork are a part of the training
- Team supervisors and field editors play very important roles in continuing this training and in ensuring the quality of MICS data

Data Collection: Monitoring field work
- Fieldwork control sheets
- Direct observation of interviews
- Review of completed questionnaires (editing)
- Spot-checking households and HH composition

Systematic Spot-Checking
How to spot-check household composition?
- Supervisors should complete columns 2-6 in the HH questionnaire and compare their results with the interviewer's entries
- Check about 5% of households (5-6 per week)
- All team members must be spot-checked; provide feedback if necessary
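A small illustrative sketch (hypothetical identifiers, not an official MICS tool) of how a supervisor could draw roughly 5% of each interviewer's completed households for spot-checking:

```python
# Illustrative: draw roughly 5% of each interviewer's completed households
# for spot-checking, with at least one household per interviewer.
import math
import random

completed = {
    "interviewer_01": ["HH-101", "HH-102", "HH-103", "HH-104", "HH-105", "HH-106"],
    "interviewer_02": ["HH-201", "HH-202", "HH-203", "HH-204"],
}

random.seed(7)  # fixed seed so the example is reproducible
for interviewer, households in completed.items():
    n_checks = max(1, math.ceil(0.05 * len(households)))
    to_check = random.sample(households, n_checks)
    print(interviewer, "->", to_check)
```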

Fieldwork Control Sheets
- Interviewers' work is monitored and evaluated by keeping an accurate record of assignments
- Both supervisors and interviewers have control forms to maintain
- These forms should be returned to the director of field operations along with the completed questionnaires
- The interviewer is responsible for ensuring that the control sheet is up to date

Observing Interviews, part 1
Why?
- To evaluate and improve interviewer performance
- To look for errors and misconceptions that cannot be detected through editing (e.g. precise but inaccurate answers)
- To check whether the interviewer is editing his or her own work before leaving the household
Who observes? The supervisors or field editors
Who should be observed? Every interviewer

Observing Interviews, part 2
How often? 5-6 interviews per week, more at the start of fieldwork
How? Just take notes, without disrupting the interview
What to do afterwards? The supervisor reviews the questionnaires with the interviewer, highlights issues, and proposes solutions and further training

Editing Questionnaires in the Field
Fundamental to survey quality. Need to ensure:
- Accurate and complete information in each questionnaire
- Correct count of questionnaires

Editing Questionnaires in the Field
- Done daily by the field editor; the supervisor can assist
- Editing of all questionnaires must be completed BEFORE leaving the cluster
- Results and errors are discussed with interviewers
- Interviewers should go back to the household for corrections
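The sketch below illustrates only the "correct count" side of field editing: reconciling the questionnaires in hand against the control sheet before leaving a cluster. The form names and counts are hypothetical.

```python
# Illustrative: reconcile questionnaire counts with the control sheet
# before the team leaves the cluster.
control_sheet = {"household": 25, "women": 31, "children_under5": 18}       # expected counts (hypothetical)
questionnaires_in_hand = {"household": 25, "women": 30, "children_under5": 18}

discrepancies = {
    form: {"expected": expected, "in_hand": questionnaires_in_hand.get(form, 0)}
    for form, expected in control_sheet.items()
    if questionnaires_in_hand.get(form, 0) != expected
}
if discrepancies:
    print("Resolve before leaving the cluster:", discrepancies)
else:
    print("Questionnaire counts match the control sheet.")
```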

Evaluating Interviewers' Performance
- Discuss the quality of interviewers' work daily
- Point out mistakes discovered during observation of interviews or noticed during editing
- Discuss examples of actual mistakes, but be careful not to embarrass individual interviewers
- Re-read relevant sections from the Interviewer's Manual with the team to resolve problems
- Encourage the interviewers to talk about any situations they encountered in the field
- Discuss whether situations were handled properly, and how to handle them in the future

Overview of presentation
- Data collection
- Field supervision

Field supervision
- Who should take part in field supervision? The supervision team, senior staff from the implementing agency, stakeholders, and UNICEF staff (except for purely strategic or political visits; avoid "supervision tourism")
- Prepare ToR for the supervision team (centred on quality control), with a report after each mission
- Very important to put field supervision in place within the first week of fieldwork (if possible, start fieldwork in one central location)

Field Supervision: Why?
- Bring equipment, money, questionnaires … to the teams
- Visit teams to observe interviews and review their work
- Re-visit selected clusters; spot-check households
- Bring completed and edited questionnaires back to a central location for data entry

Field Supervision: Field Check Tables (FCT)
- FCT are an essential tool for field supervision
- Based on questionnaires already entered, brought back from the field by supervision missions
- Generated by the data entry teams on a weekly basis
- Provide a full range of information about the quality of the data already collected
- Provide information on the work of each team and each interviewer
- To be shared on a regular basis with RO and HQ Survey Coordinators
Don't go out without them!

Field Check Tables
FC-2W: Eligible women per household
Mean number of eligible women per household, according to interviewer team

                     Urban                                        Rural
Team      Completed HHs  Eligible women  Mean/HH     Completed HHs  Eligible women  Mean/HH
Team 1         46              65          1.41           86             73           0.85 *
Team 2        172             243          1.41          214            225           1.05
Team 3        139             158          1.14           82            119           1.45
Team 4        197             236          1.20          120            112           0.93 *
Team 5        116             131          1.13          161            190           1.18
Total         670             833          1.24          663            719           1.08

* Target not met

Note: The number of women expected per HH is country-specific and defined in the sample design (it usually differs between urban and rural areas). The target is the minimum mean number of de facto eligible women per HH that teams are expected to find, and should be > 80% of what was expected at the time of sample design. Thus, if 1.2 women per HH were expected at the time of sample design, teams should be finding a minimum of 0.96 women per HH. Survey managers should provide data processors with the country-specific targets.

Example: The MICS sample was drawn based on the expectation of finding 1.46 women per HH in urban areas and 1.24 women per HH in rural areas. The targets for this table are therefore for teams to find at least 1.17 women per HH (80% of 1.46) in urban areas and 0.99 women per HH (80% of 1.24) in rural areas.
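To make the FC-2W logic concrete, here is an illustrative Python sketch (not the actual FCT program, which is produced by the data entry teams) applying the 80%-of-expected rule from the note above to two teams taken from the example table.

```python
# Illustrative FC-2W check: flag teams whose mean number of eligible women
# per completed household falls below 80% of the sample-design expectation.
EXPECTED = {"urban": 1.46, "rural": 1.24}             # expectations from the example above
TARGET = {area: 0.8 * value for area, value in EXPECTED.items()}

# (completed households, eligible women) for two teams from the example table
teams = {
    "Team 1": {"urban": (46, 65), "rural": (86, 73)},
    "Team 4": {"urban": (197, 236), "rural": (120, 112)},
}

for team, areas in teams.items():
    for area, (households, women) in areas.items():
        mean = women / households
        flag = "TARGET NOT MET" if mean < TARGET[area] else "ok"
        print(f"{team} {area}: {mean:.2f} women per HH (target {TARGET[area]:.2f}) {flag}")
```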

Field Check Tables
FC-4W: Age displacement: women
Number of women age 12-18 years listed in the household schedule and the age ratio 15/14, according to interviewer team

Team      Total women age 12-18   Age ratio (women 15 / women 14)   Target not met
Team 1             64                        0.73                       yes
Team 2             68                        0.78                       yes
Team 3             79                        1.18                        -
Team 4             67                        0.38                       yes
Team 5             74
Total             352                        0.83

Note: The target is an age ratio of women age 15 to women age 14 of > 0.8.
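A similarly illustrative sketch of the FC-4W age-ratio check, using hypothetical single-year counts: a low ratio of 15-year-old to 14-year-old women in the household listing can signal that interviewers are displacing women's ages just below the eligible range.

```python
# Illustrative FC-4W check: the ratio of women aged 15 to women aged 14
# in the household listing should exceed the target of 0.8.
TARGET_RATIO = 0.8

# Hypothetical single-year counts for one team (women age 12-18)
age_counts = {12: 10, 13: 11, 14: 11, 15: 8, 16: 8, 17: 7, 18: 9}

ratio = age_counts[15] / age_counts[14]
status = "ok" if ratio > TARGET_RATIO else "TARGET NOT MET"
print(f"Age ratio 15/14 = {ratio:.2f} ({status})")
```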

FCT: What else? USE THEM!