Item Nonresponse in Mail, Web, and Mixed Mode Surveys: A Summary of the AAPOR Session. Benjamin L. Messer, Washington State University. PAPOR Mini-Conference, Berkeley, CA, June 24, 2011.




Similar presentations
1 © 2009 University of Wisconsin-Extension, Cooperative Extension, Program Development and Evaluation Response Rate in Surveys Key resource: Dillman, D.A.,

Survey Methodology Nonresponse EPID 626 Lecture 6.
Self-Administered Surveys: Mail Survey Methods ChihChien Chen Lauren Teffeau Week 10.
David Fairris Tarek Azzam
1 Avalaura L. Gaither and Eric C. Newburger Population Division U.S. Census Bureau Washington, D.C. June 2000 Population Division Working Paper No. 44.
Children’s subjective well-being Findings from national surveys in England International Society for Child Indicators Conference, 27 th July 2011.
LIST QUESTIONS – COMPARISONS BETWEEN MODES AND WAVES Making Connections is a study of ten disadvantaged US urban communities, funded by the Annie E. Casey.
Challenges to Surveys Non-response error –Falling response rates over time Sample coverage –Those without landline telephones excluded –Growing.
1 Health Surveys January 2008 Diane Martin, MA, PhD.
EBI Statistics 101.
Chapter 11 Contingency Table Analysis. Nonparametric Systems Another method of examining the relationship between independent (X) and dependant (Y) variables.
Is Psychosocial Stress Associated with Alcohol Use Among Continuation High School Students? Raul Calderon, Jr. Ph.D., Gregory T. Smith, Ph.D., Marilyn.
© 2005 The McGraw-Hill Companies, Inc., All Rights Reserved. Chapter 8 Using Survey Research.
Survey Research & Understanding Statistics
DTC Quantitative Research Methods Three (or more) Variables: Extensions to Cross- tabular Analyses Thursday 13 th November 2014.
Survey Methods: Communicating with Respondents
Survey Research Chapter 17: How To Design And Evaluate Research In Education James Blackwood AED 615 – Fall Semester 2006.
FINAL REPORT: OUTLINE & OVERVIEW OF SURVEY ERRORS
1 © 2009 University of Wisconsin-Extension, Cooperative Extension, Program Development and Evaluation ONLINE SURVEYS.
Quantitative Research
1 Health Status and The Retirement Decision Among the Early-Retirement-Age Population Shailesh Bhandari Economist Labor Force Statistics Branch Housing.
Frederick C. Van Bennekom, Dr.B.A. Helping Clients Listen Better Survey Program Training, Development & Targeted Advice Customer Service Strategic Positioning.
EMR 6500: Survey Research Dr. Chris L. S. Coryn Kristin A. Hobson Spring 2013.
Measures of Central Tendency
Targeting Research: Segmentation Birds of a feather flock together, i.e. people with similar characteristics tend to exhibit similar behaviors Characteristics.
CHAPTER 5: CONSTRUCTING OPEN- AND CLOSED-ENDED QUESTIONS Damon Burton University of Idaho University of Idaho.
Effects of Income Imputation on Traditional Poverty Estimates The views expressed here are the authors and do not represent the official positions.
Questionnaires and Interviews
Confronting the Challenges of Household Surveys by Mixing Modes Roger Tourangeau, Westat Note: This presentation accompanied the Keynote Address by Dr.
Determining Sample Size
Web-based Surveys: Changing the Survey Process, by Holly Gunn First Monday, volume 7, number 12 (December 2002), URL:
1 Kenneth E. Wallen a,b,c, Adam C. Landon a,b, Gerard T. Kyle a,b,c, Michael A. Schuett a,c, Jeremy Leitz d, & Ken Kurzawski d a Department of Recreation,
9/23/2015Slide 1 Published reports of research usually contain a section which describes key characteristics of the sample included in the study. The “key”
Slide 1 Incentives in Surveys with Farmers Third International Conference on Establishment Surveys Montreal, Quebec, Canada June 19, 2007 Slide Kathy Ott.
A statistical method for testing whether two or more dependent variable means are equal (i.e., the probability that any differences in means across several.
SW388R6 Data Analysis and Computers I Slide 1 Central Tendency and Variability Sample Homework Problem Solving the Problem with SPSS Logic for Central.
Lesli Scott Ashley Bowers Sue Ellen Hansen Robin Tepper Jacob Survey Research Center, University of Michigan Third International Conference on Establishment.
Chapter Thirteen Validation & Editing Coding Machine Cleaning of Data Tabulation & Statistical Analysis Data Entry Overview of the Data Analysis.
Customer Satisfaction Surveys Colette Nicolle. 2 Overview  Overall response rate and suggestions  Process of analysis and reporting  Investigating.
Effectiveness of Monetary Incentives and Other Stimuli Across Establishment Survey Populations ICES III 2007 Montreal, Quebec Canada Danna Moore.
Public attitudes towards housing benefit and planning reform Results from Ipsos MORI Omnibus Survey May 2011.
Introduction to research methods 10/26/2004 Xiangming Mu.
Panel Study of Entrepreneurial Dynamics Richard Curtin University of Michigan.
Copyright © 2008 by Nelson, a division of Thomson Canada Limited Chapter 8 Part 2 Designing Research Studies SURVEY RESEARCH: BASIC METHODS OF COMMUNICATION.
The Challenge of Non- Response in Surveys. The Overall Response Rate The number of complete interviews divided by the number of eligible units in the.
Shane Lloyd, MPH 2011, 1,2 Annie Gjelsvik, PhD, 1,2 Deborah N. Pearlman, PhD, 1,2 Carrie Bridges, MPH, 2 1 Brown University Alpert Medical School, 2 Rhode.
Descriptive Research Study Investigation of Positive and Negative Affect of UniJos PhD Students toward their PhD Research Project Dr. K. A. Korb University.
FCD CWI 1 The Foundation for Child Development Index of Child Well- Being (CWI) 1975 to 2004 with Projections for 2005 A Social Indicators Project Supported.
American Community Survey (ACS) Product Types: Tables and Maps Samples Revised
Educational Research: Competencies for Analysis and Application, 9 th edition. Gay, Mills, & Airasian © 2009 Pearson Education, Inc. All rights reserved.
Increasing Efficiency in Data Collection Processes Arie Aharon, Israel Central Bureau of Statistics.
1 Public Library Use in Oregon Results from the 2006 Oregon Population Survey Oregon State Library March 2007.
The measurement effect in PC smartphone and tablet surveys Valerija Kolbas University of Essex ISER Ipsos-MORI The European Survey Research Association.
Methods of Data Collection Survey Methods Self-Administered Questionnaires Interviews Methods of Observation Non-Participant Observation Participant Observation.
1 Introduction to Statistics. 2 What is Statistics? The gathering, organization, analysis, and presentation of numerical information.
Index of Child Well-Being The Foundation for Child Development Index of Child Well- Being (CWI) 1975 to 2002 with Projections for 2003 A Social Indicators.
CAPE ROAD SURGERY Patient Questionnaire 2013 / 2014.
Introduction/ Section 5.1 Designing Samples.  We know how to describe data in various ways ◦ Visually, Numerically, etc  Now, we’ll focus on producing.
Section 29.1 Marketing Research Chapter 29 conducting marketing research Section 29.2 The Marketing Survey.
Chapter 29 Conducting Market Research. Objectives  Explain the steps in designing and conducting market research  Compare primary and secondary data.
The impact of using the web in a mixed mode follow-up of a longitudinal birth cohort study: Evidence from the National Child Development Study Presenter:
An Experiment in Open End Response Quality in Relation to Text Box Length in a Web Survey Michael Traugott Christopher Antoun University of Michigan.
Influencing Mode Choice in a Multi-Mode Survey May 2012 AAPOR Conference Presentation Geraldine Mooney Cheryl De Saw Xiaojing Lin Andrew Hurwitz Flora.
Context for the experiment?
A Comparison of Two Nonprobability Samples with Probability Samples
Chapter Eight: Quantitative Methods
Surveys of Consumers: Mixed Mode Experiments
Agenda Applying the Tailored Design Method in a Randomized Control Trial Experiment Survey Benjamin Messer Research Into Action, Portland, OR PAPOR Annual.
Chapter 5: Producing Data
Presentation transcript:

Item Nonresponse in Mail, Web, and Mixed Mode Surveys: A Summary of the AAPOR Session
Benjamin L. Messer, Washington State University
PAPOR Mini-Conference, Berkeley, CA, June 24, 2011

2011 AAPOR Session: Are Measurement and Item Nonresponse Differences a Problem in Web and Mail Mixed-Mode Surveys?
– 1) Millar & Dillman, “Do mail and web produce different answers? Mode differences in question response and item nonresponse rates”
– 2) Smyth & Olson, “Comparing numeric and text open-end response in mail and web surveys”
– 3) Lesser, Olstad, Yang, & Newton, “Item nonresponse in web and mail response to general public surveys”
– 4) Messer, Edwards, & Dillman, “Determinants of web and mail item nonresponse in address-based samples of the general public”
– 5) Israel & Lamm, “Item nonresponse in a client survey of the general public”

Background
Item nonresponse and measurement differences have not been thoroughly tested in web and mail surveys, particularly surveys of the general public.
Both types of error are thought to be similar across the two modes, since both are self-administered and visual, although some studies have found that mail obtains higher error rates (i.e., lower data quality).

PAPER 1: Millar & Dillman
Tested item nonresponse and measurement error in web and mail surveys.
Used data from the WSU Student Experience Survey, fielded in the Spring and Fall of 2009, in which the population has near-universal access to the web and both postal and email addresses.
– Spring 2009 study: 100 items, 36 questions
– Fall 2009 study: 76 items, 33 questions

Response Rates
[Chart not preserved in this transcript.]

Methods
Item nonresponse: calculated the percent of respondents missing a response to each item and compared rates across modes with z-tests.
Measurement: used chi-square tests for differences in the distributions of responses across modes.
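For concreteness, here is a minimal sketch of those two tests in Python. All counts are hypothetical, invented for illustration; they are not the study's data.

    # Two-proportion z-test on item nonresponse rates, plus a chi-square test
    # on answer distributions (all counts below are hypothetical).
    from scipy.stats import chi2_contingency
    from statsmodels.stats.proportion import proportions_ztest

    # Item nonresponse: respondents who skipped the item, by mode (web, mail).
    skipped = [46, 61]
    received = [880, 910]   # respondents who received the item, by mode
    z, p = proportions_ztest(skipped, received)
    print(f"item nonresponse: z = {z:.2f}, p = {p:.3f}")

    # Measurement: rows are response options, columns are modes (web, mail).
    answers = [[210, 195],  # strongly agree
               [390, 410],  # agree
               [180, 200],  # disagree
               [54,  44]]   # strongly disagree
    chi2, p, dof, expected = chi2_contingency(answers)
    print(f"measurement: chi2({dof}) = {chi2:.2f}, p = {p:.3f}")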

Item Nonresponse
No statistically significant differences in overall item nonresponse rates between the web and mail modes in either experiment.

Item Nonresponse, cont.
Some individual items exhibited significant mode differences in item nonresponse rates:
– Multi-item questions
– Branching questions
– Open-end questions

Measurement Differences
Very few significant differences exist.
– Multi-item questions are the most likely to exhibit measurement differences.

PAPER 2: Smyth & Olson
Tested web vs. mail measurement differences in open-ended questions, a question type more prone to error in self-administered, visual modes.
Used data from the 2009 Quality of Life in Changing Nebraska Survey (QLCN), which obtained a 45.6% total response rate.

Methods
Four experimental treatment groups:
– Mail-only, Mail+Web, Web+Mail, Web-only
Tested two different open-end question formats:
– 7 number-box items (e.g., date of birth): measurement differences (distributions of answers) and item nonresponse rates (missing or not missing)
– 2 text-box items (e.g., descriptions or narratives): item nonresponse rates (missing or not missing) and response quality (e.g., length, number of themes, elaboration)
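The length and elaboration indicators for text-box items can be computed mechanically; a small sketch follows. The example strings are invented, and the paper's theme counts would require human coding rather than this kind of automation.

    # Simple data-quality indicators for one open-ended text response.
    def text_quality(answer: str) -> dict:
        words = answer.split()
        return {
            "missing": len(words) == 0,           # item nonresponse flag
            "n_characters": len(answer.strip()),  # response length
            "n_words": len(words),                # rough elaboration measure
        }

    print(text_quality("We moved here for the schools and stayed for the neighbors."))
    print(text_quality("   "))  # a blank answer counts as item nonresponse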

Results
Number box:
– Regression analyses found very few significant differences in either responses or measurement indicators between the web and mail modes.
– The few differences that did occur were largely due to differential participation in the web and mail modes, as well as questionnaire design.
Text box:
– Analyses show that item nonresponse rates and data quality were very similar across modes.
– The differences found to be significant were largely due to questionnaire design and mode.

PAPER 3: Lesser, Newton, & Yang
Compared item nonresponse rates across modes and question types, and compared unit response between mail, web, and telephone modes.
Used data from the 2008 & 2010 general public surveys conducted for the Oregon Department of Transportation (address-based samples).

Methods
Four groups:
– Telephone, Mail-only, Web+Mail, Web/Mail (choice)
Five question types:
– Likert, Open-end, Filtered, Tabular, & Demographic

Item Nonresponse Results
[Table of item nonresponse rates by question type (Likert, Filter, Table, Open, Demo, Overall) for the Mail, Web+Mail, and Web/Mail groups; values not preserved in this transcript.]

Item Nonresponse Results: 2008
[Table of item nonresponse rates by question type (Likert, Filter, Table, Open, Demo, Overall) for the Mail, Web/Mail, and Telephone groups; values not preserved in this transcript.]

Unit Response Results
[Chart not preserved in this transcript.]

PAPER 4: Messer, Edwards, & Dillman
Tested for item nonresponse differences controlling for survey mode and design, question types, and demographic characteristics.
Used data from three general public surveys with address-based samples:
– 2007 Lewiston & Clarkston Quality of Life Survey (LCS): Web+Mail, 55%; Mail-only, 66%; 92 items, 51 questions
– 2008 Washington Community Survey (WCS): Web+Mail, 40%; Mail-only, 50%; 110 items, 52 questions
– 2009 Washington Economic Survey (WES): Web+Mail, 50%; Mail-only, 62%; 96 items, 57 questions

Methods
Survey modes: web, mail, & mail follow-up
– Survey designs: web+mail & mail-only
Question types: screened, multi-item, open-end, close-end
Question formats: factual, attitudinal, behavioral
Demographic characteristics: gender, age, education, income
Item nonresponse rate: missing responses divided by the total number of possible responses for each respondent
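A small sketch of that per-respondent rate, on toy data. The "skip" marker for screened-out items is an invented convention here, not the authors' coding scheme; screened-out items are excluded from the denominator.

    import numpy as np
    import pandas as pd

    # Rows = respondents, columns = items; NaN = left blank, "skip" = screened out.
    answers = pd.DataFrame({
        "q1":  [3, 2, np.nan],
        "q2":  ["yes", "no", "yes"],
        "q2a": ["often", "skip", np.nan],  # asked only if q2 == "yes"
    })

    eligible = answers != "skip"            # items each respondent should answer
    missing = answers.isna() & eligible     # eligible items left blank
    rate = missing.sum(axis=1) / eligible.sum(axis=1)
    print(rate)  # one item nonresponse rate per respondent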

Item Nonresponse Rates by Survey Mode

Question Effects
The same trend persists across question types and formats, with web obtaining the lowest rate, followed by mail and then mail follow-up, in all three surveys.
In regression analyses, both survey mode and question type & format (e.g., screened, multi-item) are significant predictors of item nonresponse.

Demographic Effects
The same trends in item nonresponse rates are also found across different demographic subgroups. We also found that older respondents with less education and income have higher item nonresponse rates.
– Regression analyses controlling for mode and demographics indicate this could be due to differential participation.

Item Nonresponse Rates by Survey Design
Web+mail and mail-only designs obtained statistically similar item nonresponse rates, ranging between 4% and 8%.
Question type and format:
– Regression analyses indicate that question format is a significant predictor of item nonresponse, controlling for design.
Demographic characteristics:
– Older respondents with lower levels of education and income tend to exhibit higher rates, controlling for design.
Survey design itself was not a significant predictor.

PAPER 5: Israel & Lamm
Tested for item nonresponse differences controlling for survey mode, question characteristics, and respondent demographics.
Used data from the 2008, 2009, and 2010 University of Florida Extension Customer Satisfaction Surveys.

Methods
Mail-only and Web-only modes; had postal and email addresses of clients.
Used logistic regression and HLM (hierarchical linear modeling) statistical methods.
Demographic characteristics:
– Gender, education, age, race, & client participation status
Question types:
– Open-end, screened, demographic, grid, & yes/no
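A hedged sketch of the logistic-regression step: predicting whether an individual item is missing from mode, question type, and a demographic. The data and variable names are invented, and the authors' HLM models, which nest items within respondents, are not reproduced here.

    import pandas as pd
    import statsmodels.formula.api as smf

    # One row per item observation (hypothetical data).
    data = pd.DataFrame({
        "missing":  [0, 1, 0, 1, 1, 0, 1, 0, 1, 0, 1, 0],
        "mode":     ["mail", "mail", "web", "web", "mail", "web"] * 2,
        "open_end": [1, 1, 0, 0, 1, 1, 0, 0, 1, 1, 0, 0],
        "age":      [62, 71, 34, 45, 68, 29, 55, 40, 73, 38, 66, 50],
    })

    # Log-odds of a missing item as a function of mode, question type, and age.
    model = smf.logit("missing ~ C(mode) + open_end + age", data=data).fit(disp=0)
    print(model.params)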

Item Nonresponse Rates by Mode (three surveys)
Mail    Web
7.3%    5.1%
7.1%    6.3%
7.1%    5.5%

Completed Survey Rate by Mode (three surveys)
Mail     Web
28.2%    47.2%
28.4%    48.2%
30.0%    41.3%

Determinants of Item Nonresponse
Logistic regression analysis shows that demographic characteristics of respondents have little effect on item nonresponse.
Open-end and screened question types yielded the highest item nonresponse rates.
HLM analyses find few mode or demographic effects; question characteristics are the main drivers of item nonresponse.

Tying it all together
Millar & Dillman found few item nonresponse rate differences between web and mail in a population with easy access to both; however, question types were found to influence data quality net of mode.
For the general public, Smyth & Olson found that open-ended questions obtained variable data quality depending on the survey mode and the format of the question (numeric vs. text), with web obtaining slightly lower nonresponse rates and better data quality.
In multiple general public web and mail surveys, Lesser et al., Messer et al., and Israel & Lamm found that web obtained lower item nonresponse rates. However, when combined with a mail follow-up, web+mail item nonresponse rates approximate those obtained by mail alone. In addition, question characteristics and respondent demographics (Messer et al.) were found to influence item nonresponse rates. Lesser et al. also showed that telephone obtained the lowest item nonresponse rates, as might be expected.

Where we stand now…
Using similar questionnaires, web may obtain slightly better data quality than mail, at least in general public surveys, but it also currently obtains lower unit response rates, indicating a trade-off. In addition, web may attract the types of respondents who have a greater propensity to complete the questionnaire than mail does, and complex question types/formats can produce lower data quality net of survey mode.

In addition…
I also saw Danna L. Moore present “Driving Respondents to the Web: Experimental Trial of Benefit Appeals and Impacts on Survey Completion.”
– She tested three different letter types:
One basic letter with no appeals
One letter with an appeal to save the government money by using the web (instead of paper)
One letter with an appeal to help out the local community
– Results indicated that the appeal to save money had a positive, significant effect on web response rates compared with the other two letters (about 3 percentage points higher).

Citations and Contact Info
Millar, Morgan M. & Don A. Dillman. 2011. “Improving Response to Web and Mixed-Mode Surveys.” Public Opinion Quarterly 75(2):249-269.
Smyth, Jolene & Kristen Olson. Unpublished manuscript. “Comparing Numeric and Text Open-End Responses in Mail & Web Surveys.”
Lesser, Virginia, Andy Olstad, Danny Yang & Lydia Newton. Unpublished manuscript. “Item Nonresponse in Web and Mail Response to General Public Surveys.”
Messer, Benjamin & Don A. Dillman. Forthcoming. “Surveying the General Public Over the Internet Using Address-based Sampling and Mail Contact Procedures.” Public Opinion Quarterly.
Israel, Glenn D. & Alex J. Lamm. Unpublished manuscript. “Item Nonresponse in a Client Survey of the General Public.”