1 Item Nonresponse in Mail, Web, and Mixed Mode Surveys: A Summary of the AAPOR Session
Benjamin L. Messer, Washington State University
PAPOR Mini-Conference, Berkeley, CA, June 24, 2011

2 2011 AAPOR Session: Are Measurement and Item Nonresponse Differences a Problem in Web and Mail Mixed-Mode Surveys?
– 1) Millar & Dillman, "Do mail and web produce different answers? Mode differences in question response and item nonresponse rates"
– 2) Smyth & Olson, "Comparing numeric and text open-end response in mail and web surveys."
– 3) Lesser, Olstad, Yang, & Newton, "Item nonresponse in web and mail response to general public surveys."
– 4) Messer, Edwards, & Dillman, "Determinants of web and mail item nonresponse in address-based samples of the general public."
– 5) Israel & Lamm, "Item nonresponse in a client survey of the general public."

3 Background
Item nonresponse and measurement differences have not been thoroughly tested in web and mail surveys, particularly surveys of the general public.
Both types of error are thought to be similar between the modes, since both are self-administered and visual, although some studies have found that mail obtains higher error rates (i.e., lower data quality).

4 PAPER 1: Millar & Dillman
Tested item nonresponse and measurement error in web and mail surveys.
Used data from the WSU Student Experience Survey, fielded in Spring and Fall 2009; the population has near-universal web access and both postal and email addresses.
– Spring 2009 study: 100 items, 36 questions
– Fall 2009 study: 76 items, 33 questions

5 Response Rates

6 Methods
Item nonresponse: calculated the percent of respondents missing a response and compared rates across modes with z-tests.
Measurement: used chi-square tests to test for differences in the distributions of responses.
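
As a minimal sketch of these two tests, assuming Python with scipy and statsmodels available: the counts and the contingency table below are invented for illustration, not the paper's data.

```python
import numpy as np
from scipy.stats import chi2_contingency
from statsmodels.stats.proportion import proportions_ztest

# Item nonresponse: percent of respondents missing the item, compared by mode.
missing = np.array([12, 25])   # respondents missing the item: [web, mail]
totals = np.array([480, 500])  # respondents who received the item, per mode
z_stat, p_val = proportions_ztest(count=missing, nobs=totals)
print(f"item nonresponse: z = {z_stat:.2f}, p = {p_val:.3f}")

# Measurement: chi-square test on the distribution of substantive answers.
# Rows = mode (web, mail); columns = response categories of one question.
table = np.array([[120, 200, 148],
                  [110, 230, 135]])
chi2, p, dof, _ = chi2_contingency(table)
print(f"answer distribution: chi2 = {chi2:.2f}, df = {dof}, p = {p:.3f}")
```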

7 Item Nonresponse
No statistically significant differences in overall item nonresponse rates between web and mail modes in either experiment.

8 Item Nonresponse, cont.
Some individual items exhibited significant mode differences in item nonresponse rates:
– Multi-item questions
– Branching questions
– Open-end questions

9 Measurement Differences
Very few significant differences exist.
– Multi-item questions are the most likely to exhibit measurement differences.

10 PAPER 2: Smyth & Olson
Tested web vs. mail measurement differences in open-ended questions, a question type more prone to error in self-administered, visual modes.
Used data from the 2009 Quality of Life in Changing Nebraska Survey (QLCN).
– 45.6% total response rate

11 Methods
Four experimental treatment groups:
– Mail-only, Mail+Web, Web+Mail, Web-only
Tested two different open-end question formats:
– 7 number-box items (e.g., date of birth)
  – Measurement differences (distributions of answers to questions)
  – Item nonresponse rates (missing or not missing)
– 2 text-box items (e.g., descriptions or narratives)
  – Item nonresponse rates (missing or not missing)
  – Quality (e.g., length, number of themes, elaboration; sketched below)
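
A rough sketch of the simpler text-box quality indicators just named (missing or not, length, and word count as an elaboration proxy), assuming Python; counting themes requires human coding and is omitted, and the sample answers are invented.

```python
# Simple quality indicators for open-ended text answers: a sketch, not the
# authors' actual coding scheme ("themes" would need human coders).
def text_quality(response: str) -> dict:
    """Return basic quality indicators for one open-ended answer."""
    words = response.split()
    return {
        "missing": len(words) == 0,     # item nonresponse indicator
        "n_characters": len(response),  # raw length
        "n_words": len(words),          # a crude proxy for elaboration
    }

for answer in ["", "Good schools.",
               "Good schools, low crime, and friendly neighbors."]:
    print(text_quality(answer))
```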

12 Results
Number box:
– Regression analyses found very few significant differences in either responses or measurement indicators between web and mail modes.
– The few differences that occurred were largely due to differential participation in web and mail modes, as well as questionnaire design.
Text box:
– Analyses show that item nonresponse rates and data quality were very similar across modes.
– However, the differences found to be significant were largely due to questionnaire design and mode.

13 PAPER 3: Lesser, Newton, & Yang
Compared item nonresponse rates across modes and question types, and compared unit response between mail, web, and telephone modes.
Used data from the 2008 & 2010 general public surveys conducted for the Oregon Department of Transportation.
– Address-based samples

14 Methods
Four groups:
– Telephone, Mail-only, Web+Mail, Web/Mail (choice)
Five question types:
– Likert, Open-end, Filtered, Tabular, & Demographic

15 Item Nonresponse Results (%)

Type      Mail   Web+Mail   Web/Mail
Likert     2.4      2.6        2.4
Filter     0.9      2.7        2.7
Table      4.9      5.6        5.2
Open       3.2      4.1        4.3
Demo       4.0      4.3        3.3
Overall    3.5      4.1        3.9

16 Item Nonresponse Results: 2008 (%)

Type      Mail   Web/Mail   Telephone
Likert     1.8      1.7        0.65
Filter     1.3      1.4        0.44
Table      4.1      3.2        0.20
Open       5.5      4.3        2.55
Demo       3.7      3.6        4.05
Overall    3.0      2.6        0.87

17 Unit Response Results

18 PAPER 4: Messer, Edwards, & Dillman
Tested for item nonresponse differences controlling for survey mode and design, question types, and demographic characteristics.
Used data from three general public surveys with address-based samples:
– 2007 Lewiston & Clarkston Quality of Life Survey (LCS): Web+Mail, 55%; Mail-only, 66%; 92 items, 51 questions
– 2008 Washington Community Survey (WCS): Web+Mail, 40%; Mail-only, 50%; 110 items, 52 questions
– 2009 Washington Economic Survey (WES): Web+Mail, 50%; Mail-only, 62%; 96 items, 57 questions

19 Methods
Survey modes: web, mail, & mail follow-up
– Survey designs: web+mail & mail-only
Question types:
– Screened, Multi-item, Open-end, Close-end
Question formats:
– Factual, Attitudinal, Behavioral
Demographic characteristics:
– Gender, age, education, income
Item nonresponse rate: missing responses divided by the total number of possible responses for each respondent (a sketch of this computation follows below)
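
A minimal sketch of that per-respondent rate, assuming Python with pandas; the answer matrix is hypothetical, and the eligibility adjustment for screened questions is omitted.

```python
import numpy as np
import pandas as pd

# Hypothetical answers: rows are respondents, columns are items; NaN = missing.
answers = pd.DataFrame({
    "q1": [1.0, np.nan, 3.0],
    "q2": [np.nan, 2.0, 2.0],
    "q3": [4.0, np.nan, 1.0],
})

# Item nonresponse rate per respondent: missing items / possible items.
# (In the actual surveys the denominator would exclude items a respondent
# was screened out of; that adjustment is omitted here.)
inr_rate = answers.isna().sum(axis=1) / answers.shape[1]
print(inr_rate)
```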

20 Item Nonresponse Rates by Survey Mode

21 Question Effects
The same trend persists across question types and formats, with web obtaining the lowest rate, followed by mail and then mail follow-up, in all three surveys.
In regression analyses, both survey mode and question type & format (e.g., screened, multi-item, etc.) are significant predictors of item nonresponse.

22 Demographic Effects
The same trends in item nonresponse rates are also found across different demographic subgroups.
We also found that older respondents with less education and income have higher item nonresponse rates.
– Regression analyses controlling for mode and demographics indicate this could be due to differential participation.

23 Item Nonresponse Rates by Survey Design
Web+mail and mail-only designs obtained statistically similar item nonresponse rates, ranging between 4-8%.
Question type and format:
– Regression analyses indicate that question format is a significant predictor of item nonresponse, controlling for design.
Demographic characteristics:
– Older respondents with lower levels of education and income tend to exhibit higher rates, controlling for design.
Survey design itself was not a significant predictor.

24 PAPER 5: Israel & Lamm
Tested for item nonresponse differences controlling for survey mode, question characteristics, and respondent demographics.
Used data from the 2008, 2009, and 2010 University of Florida Extension Customer Satisfaction Surveys.

25 Methods
Mail-only and Web-only modes; had postal and email addresses of clients.
Used logistic regression and HLM statistical methods (a sketch of the logistic-regression step follows below).
Demographic characteristics:
– Gender, education, age, race, & client participation status
Question types:
– Open-end, screened, demographic, grid, & yes/no
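
A hedged sketch of the logistic-regression step, assuming Python with statsmodels: the item-level records are simulated (loosely echoing the session's findings), the variable names are assumptions, and the HLM step, which nests items within respondents, is not shown.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated respondent-item records: open-end items and older respondents
# miss more; web misses slightly less. Not the actual survey data.
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "web_mode": rng.integers(0, 2, n),  # 1 = web, 0 = mail
    "open_end": rng.integers(0, 2, n),  # 1 = open-end item
    "age": rng.integers(18, 85, n),
})
logit_p = -2.5 + 0.8 * df["open_end"] + 0.02 * df["age"] - 0.3 * df["web_mode"]
df["missing"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Logistic regression of item missingness on mode, item type, and age.
model = smf.logit("missing ~ web_mode + open_end + age", data=df).fit(disp=0)
print(model.params)
```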

26 Item Nonresponse Rates by Mode

Year   Mail   Web
2008   7.3%   5.1%
2009   7.1%   6.3%
2010   7.1%   5.5%

27 Completed Survey Rate by Mode

Year   Mail    Web
2008   28.2%   47.2%
2009   28.4%   48.2%
2010   30.0%   41.3%

28 Determinants of Item Nonresponse
Logistic regression analysis shows that demographic characteristics of respondents have little effect on item nonresponse.
Open-end and screened question types yielded the highest item nonresponse rates.
HLM analyses find few mode or demographic effects; question characteristics account for most of the variation in item nonresponse.

29 Tying it all together
Millar & Dillman found few item nonresponse rate differences between web and mail in a population with easy access to both; however, question types were found to influence data quality net of mode.
For the general public, Smyth & Olson found that one particular question type (open-ended) obtained variable data quality depending on the survey mode and the format of the question (numeric vs. text), with web obtaining slightly lower nonresponse rates and better data quality.
In multiple general public web and mail surveys, Lesser et al., Messer et al., and Israel & Lamm found that web obtained lower item nonresponse rates. However, when combined with a mail follow-up, web+mail item nonresponse rates approximate those obtained by mail alone. In addition, question characteristics and respondent demographics (Messer et al.) were found to influence item nonresponse rates. Lesser et al. also showed that telephone obtained the lowest item nonresponse rates, as might be expected.

30 Where we stand now…
Using similar questionnaires, web may obtain slightly better data quality than mail, at least in general public surveys, but it currently obtains lower unit response rates, indicating a trade-off. In addition, web may attract the types of respondents who have a greater propensity to complete the questionnaire than mail does, and complex question types/formats can produce lower data quality net of survey mode.

31 In addition…
I also saw Danna L. Moore present "Driving Respondents to the Web: Experimental Trial of Benefit Appeals and Impacts on Survey Completion."
– She tested three different letter types:
  – One basic letter with no appeals
  – One letter with an appeal to save the government money by using the web (instead of paper)
  – One letter with an appeal to help out the local community
– Results indicated that the appeal to save money had a positive, significant effect on web response rates compared with the other two letters (about 3 percentage points higher).

32 Citations and Contact Info
Millar, Morgan M. & Don A. Dillman. 2011. "Improving Response Rates to Web and Mixed-Mode Surveys." Public Opinion Quarterly 75(2): 249-69.
– morgan_millar@wsu.edu
Smyth, Jolene & Kristen Olson. Unpublished manuscript. "Comparing Numeric and Text Open-End Responses in Mail & Web Surveys."
– jsmyth2@unl.edu
Lesser, Virginia, Andy Olstad, Danny Yang & Lydia Newton. Unpublished manuscript. "Item Nonresponse in Web and Mail Response to General Public Surveys."
– lesser@science.oregonstate.edu
Messer, Benjamin & Don A. Dillman. Forthcoming. "Surveying the General Public Over the Internet Using Address-based Sampling and Mail Contact Procedures." Public Opinion Quarterly.
– bmesser@wsu.edu
Israel, Glenn D. & Alex J. Lamm. Unpublished manuscript. "Item Nonresponse in a Client Survey of the General Public."
– gdisrael@ufl.edu

