“... providing timely, accurate, and useful statistics in service to U.S. agriculture.”

Using Mixed Methods to Evaluate Survey Questionnaires
Heather Ridolfo, Kathy Ott, Michael Gerling, Chris Marokov, Herschel Sanders, & Andrew Dau
National Agricultural Statistics Service

FedCASIC March 19, 2014

Questionnaire Evaluation at NASS
- Limited questionnaire evaluation:
  - Quantitative methods
  - Expert review
  - Cognitive interviewing
    - Very few studies, most using small samples

Improving Questionnaire Evaluation at NASS
- Increase input from respondents
- Expanding the cognitive interview program
- CARI/behavior coding
- Quantitative methods (e.g., field tests, nonresponse analysis, imputation rates)
- Mixed-methods approach

Current Study
- Recent OMB recommendation to modify the answer categories for some questions on our Ag Labor Survey
- Opportunity to do a mixed-methods test:
  - Cognitive interviewing
  - Field test
  - Behavior coding

Ag Labor Survey
- Measures types/numbers and hours/wages of farm workers
- Conducted twice a year
- Data primarily collected by mail, with CATI follow-up and in-person interviewing

Ag Labor Survey (cont.)
- OMB recommended we use Standard Occupational Classification (SOC) codes to measure types of hired labor
- Longer, more detailed list of occupations

Original Hired Labor Question

Revised Hired Labor Question

Cognitive Interviews
- Two rounds of testing (n = 24)
- Tested only the questions related to categorizing workers (original and two revised versions)
- Interviews conducted by 9 interviewers in 6 states

Cognitive Interview Findings
- The order of the worker codes and the question was confusing in the new version
- Respondents had trouble finding the correct code
  - Major groupings and “Other” categories were not seen
- Respondents had trouble picking a single code
- Examples of types of workers seemed helpful to respondents, but they caused incorrect answers
- The term ‘hired to do’ caused some confusion

Cognitive Interview Findings (cont.)
- Issues we weren’t looking for:
  - Definition of hired worker (e.g., family members, operator, children?)
  - Level of reporting (by category [what we wanted] or by worker?)
  - Wages and hours: some workers are paid different wages for different jobs

Paper Instrument Improvements
- Reordered the presentation of the worker codes and the question
- New layout of worker codes:
  - More obvious major categories (e.g., spacing, fonts)
  - Removal of definition text and examples

CATI Instrument
First, respondents are asked: What type of work were they hired to do?
1. Field work
2. Livestock work
3. Supervision/Management
4. Other work

CATI Instrument (cont.)
Then they are asked: More specifically, which type of work were they hired to do?
(Open ended; the interviewer then selects the appropriate code based on the response.)
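The two-stage flow above can be sketched as a simple lookup: the first-stage answer narrows the set of detailed codes the interviewer chooses among after the open-ended follow-up. The SOC codes and labels below are illustrative examples, not NASS's actual code list.

```python
# Illustrative sketch of a two-stage CATI coding flow. The SOC codes and
# labels are examples only, not the actual NASS Ag Labor code list.

FIRST_STAGE = {
    1: "Field work",
    2: "Livestock work",
    3: "Supervision/Management",
    4: "Other work",
}

# Hypothetical mapping from first-stage answer to candidate detailed codes.
CANDIDATE_CODES = {
    1: {"45-2092": "Farmworkers and Laborers, Crop, Nursery, and Greenhouse"},
    2: {"45-2093": "Farmworkers, Farm, Ranch, and Aquacultural Animals"},
    3: {"45-1011": "First-Line Supervisors of Farming Workers",
        "11-9013": "Farmers, Ranchers, and Other Agricultural Managers"},
    4: {"45-2041": "Graders and Sorters, Agricultural Products",
        "53-7064": "Packers and Packagers, Hand"},
}

def candidate_codes(first_stage_answer: int) -> dict:
    """Return the detailed codes the interviewer chooses among,
    given the respondent's first-stage answer."""
    return CANDIDATE_CODES[first_stage_answer]

print(candidate_codes(3))
```

The point of the design is that the interviewer never has to scan the full expanded code list at once, only the subset consistent with the first-stage answer.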

Challenges & Limitations of CIs
- Tested the paper version only
- Cognitive interviewers had limited training and little or no experience
- Convenience sample; couldn’t sample large operations
- Must incorporate SOC codes

Field Test
- Respondents were administered the original question during the survey
- After the survey was completed, respondents were read the following statement and then asked the new question version:

“NASS is testing a new set of worker groups for the Agricultural Labor Survey, and we are asking for your help with this project. You will be asked to report on the same workers from the previous questions, separating the workers by the main type of work they were hired to do based on the expanded list of worker codes.”

Field Test Findings
- Fewer usable records for the new version across all regions (ranging from 1% to 19% fewer by region)
- Estimates of the number of paid workers were compared by region
- Of the 51 possible comparisons, 5 were statistically significant at the 0.05 level… BUT…
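As a purely arithmetic aside (not a claim from the slides, and it assumes independent tests): with 51 comparisons at α = 0.05, a couple of "significant" results are expected from chance alone, so 5 of 51 is only modestly surprising before accounting for the design limitations on the next slide.

```python
# Multiple-comparisons sanity check for "5 of 51 significant at 0.05".
# Assumes 51 independent tests under the null, purely for illustration.
from math import comb

n_tests, alpha = 51, 0.05
expected_false_positives = n_tests * alpha  # ~2.55 expected by chance

# P(5 or more significant results by chance), binomial model
p_five_or_more = sum(
    comb(n_tests, k) * alpha**k * (1 - alpha)**(n_tests - k)
    for k in range(5, n_tests + 1)
)
print(round(expected_false_positives, 2), round(p_five_or_more, 3))
```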

Challenges & Limitations of FT
- Did not do a split-sample field test
- Respondents always received the original question first
- Data from the new question versions were not edited; estimates were not released
- Interviewer effects (more on this in a minute)

Behavior Coding
- Quantitative, systematic coding of interviewer-respondent interaction
- Uses a representative sample

Behavior Coding (cont.)
- Audio recordings and screen shots were captured for 500 interviews from the October 2013 Ag Labor Survey
- These recordings/screen shots were loaded into CARI for behavior coding
- The plan is to code 50 interviews, using 5 coders

Behavior Coding (cont.)
- First-level exchange:
  - Interviewer behavior
  - Respondent behavior
- Final response:
  - Respondent behavior
  - Data entry
  - Number of exchanges required
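A coding frame like the one above lends itself to a small record-and-tally structure: one coded record per administered item, then counts aggregated across interviews. The code labels below are illustrative stand-ins, not NASS's actual behavior codes.

```python
# Minimal sketch of a behavior-coding tally matching the frame above.
# The code labels and records are made up for illustration.
from collections import Counter
from dataclasses import dataclass

@dataclass
class CodedItem:
    interviewer_first_exchange: str  # e.g. "exact", "minor_change", "major_change"
    respondent_first_exchange: str   # e.g. "adequate", "request_clarification"
    final_response: str              # e.g. "adequate", "inadequate", "dont_know"
    data_entry: str                  # e.g. "matches", "differs"
    n_exchanges: int                 # exchanges needed to reach a final answer

def tally_interviewer_behavior(items):
    """Count interviewer behavior at the first exchange across coded items."""
    return Counter(i.interviewer_first_exchange for i in items)

coded = [
    CodedItem("exact", "adequate", "adequate", "matches", 1),
    CodedItem("major_change", "request_clarification", "adequate", "matches", 3),
    CodedItem("exact", "adequate", "adequate", "differs", 1),
]
print(tally_interviewer_behavior(coded))  # Counter({'exact': 2, 'major_change': 1})
```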

Findings
- None yet! Still trying to achieve good coder agreement
- Preliminary findings:
  - Interviewers are skipping the new questions and entering data from the old questions
  - The new questions are hard to administer and burdensome for interviewers and respondents

Challenges & Limitations of BC
- Technical challenges:
  - Working with the new CARI system
  - Sound issues
  - Interviewers advancing too quickly/system lag

Challenges & Limitations of BC (cont.)
- Methodological challenges:
  - Poor intercoder agreement
  - Complex, conversational instrument
  - Interviewer variability
  - Lack of training in behavior coding
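Agreement of the kind the slides are struggling with is commonly quantified with a chance-corrected statistic such as Cohen's kappa. A minimal two-coder sketch, using made-up behavior codes rather than actual study data:

```python
# Cohen's kappa for two coders labeling the same items.
# The example codes are invented for illustration, not study data.
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Chance-corrected agreement between two coders' labels."""
    assert len(codes_a) == len(codes_b)
    n = len(codes_a)
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    labels = set(codes_a) | set(codes_b)
    # Expected agreement if both coders labeled independently at their own rates
    expected = sum((freq_a[l] / n) * (freq_b[l] / n) for l in labels)
    return (observed - expected) / (1 - expected)

a = ["exact", "exact", "major_change", "exact", "minor_change"]
b = ["exact", "minor_change", "major_change", "exact", "minor_change"]
print(round(cohens_kappa(a, b), 2))  # -> 0.69
```

Raw percent agreement here is 80%, but kappa is lower because some matches are expected by chance; that gap is why behavior-coding studies report kappa rather than raw agreement.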

Outcomes
Was this mixed-methods approach worth it?
- CI: Made changes to the format of the paper form, but we don’t know whether they are improvements
- FT: Few differences in nonresponse in the CATI, but we know from the behavior coding that the new questions were not always asked
- BC: No findings yet, but it made us aware of burden and of flaws in the field test
- Need more collaboration for testing

Moving Forward
- How can we better plan mixed-methods tests?
- How can we do better tests in production?
- How can we get all players on the same page? (Survey administrators, research staff, call center supervisors, and enumerators all have different motivations.)

Questions? Comments?
Heather Ridolfo