
1 Improving Survey Response Rates: The Effect of Embedded Questions in Web Survey Email Invitations
Nick Inchausti, SurveyMonkey Mingnan Liu, Facebook

2 Response Rates Response rates are a critical indicator of data quality for surveys, and as survey researchers we are always trying to get as close to 100% response as possible. For mixed-mode and web surveys, email invitations are widely used as the first point of contact with respondents. We wanted to find out: what happens if data collection begins in the initial survey invitation itself?

3 Literature Review Personalization sometimes boosts response rates (Heerwegh, 2005), but sometimes has no effect (Porter & Whitcomb, 2003). A white background and a simple header produced higher response rates than other conditions (Whitcomb & Porter, 2004). Mentioning the purpose of the email (a request for survey participation) and the survey sponsor in the subject line also affected survey participation (Porter & Whitcomb, 2005). Several other factors, including the length of the email, the placement of the URL, and the estimated time to complete the survey, have also been explored in experiments aimed at improving survey participation and response rates (Kaplowitz, Lupi, Couper, & Thorp, 2012; Keusch, 2012; Trouteaud, 2004).

4 Research Question New product feature: the ability to embed a survey question within an email. Research question: Do survey invitations that include an embedded question have higher response rates than standard invitations? Outcomes: click rate, completion rate, and a data quality check comparing responses to the first question.

5 Embedded Question Screenshot: embedded survey invitation email
Screenshot: standard survey invitation email

6 Experiment Design Sample frame: SurveyMonkey customers who had agreed to participate in research projects and provided their email addresses. Data collected July 27 – August 8, 2016. Initial email invitation to complete a survey, with a follow-up reminder 4 days later. Random assignment to one of two conditions: a standard invitation or an invitation with the first question embedded. 4,333 valid emails in the embedded condition and 4,347 valid emails in the standard condition. A 13-question survey, identical in both conditions, covering experience with SurveyMonkey, satisfaction with the survey platform, and interest in additional features.
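As an illustration of the random assignment step, here is a minimal Python sketch; the address list, the assign_conditions helper, and the seed are hypothetical, since the presentation does not describe the actual assignment procedure used.

```python
import random

def assign_conditions(emails, seed=42):
    """Randomly split a sample frame into two conditions (illustrative only)."""
    rng = random.Random(seed)
    shuffled = emails[:]            # copy so the original frame is untouched
    rng.shuffle(shuffled)
    midpoint = len(shuffled) // 2
    return {
        "embedded": shuffled[:midpoint],   # invitation with the first question embedded
        "standard": shuffled[midpoint:],   # standard "Begin survey" invitation
    }

# Hypothetical usage with placeholder addresses
frame = [f"respondent{i}@example.com" for i in range(8680)]
groups = assign_conditions(frame)
print(len(groups["embedded"]), len(groups["standard"]))
```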

7 Results – Improved email click rate
The click rate was higher for the embedded invitation than for the standard invitation (32.0% vs. 26.2%), a statistically significant difference (p < .001). Respondents in the embedded condition were more likely to click on the embedded question and start the survey than respondents in the standard condition were to click the "Begin survey" button.

8 Results – Improved survey completion rate
The completion rate was higher for the embedded invitation than for the standard invitation (29.1% vs. 24.4%), a statistically significant difference (p < .001). Completion rate = # completed / # sent.

9 Results – Small negative effect on survey drop-out rate
Condition   Valid emails   Clicked   Completed   Completed / Clicked
Embedded    4,333          1,388     1,261       90.8%
Standard    4,347          1,141     1,059       92.8%
The proportion of respondents who completed the survey among those who clicked into it was slightly higher in the standard condition, which means the embedded version had a slightly higher drop-out rate, but the difference is not statistically significant (p = .07).
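For readers who want to reproduce the arithmetic, here is a minimal sketch using only the counts reported above. The two-proportion z-test on completed/clicked is our assumption about how such a difference could be checked, not necessarily the test the authors used, though it comes out near the reported p = .07.

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-sided two-proportion z-test with a pooled standard error."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Counts from the presentation
emb_sent, emb_clicked, emb_completed = 4333, 1388, 1261
std_sent, std_clicked, std_completed = 4347, 1141, 1059

print(f"click rate:        {emb_clicked/emb_sent:.1%} vs {std_clicked/std_sent:.1%}")
print(f"completion rate:   {emb_completed/emb_sent:.1%} vs {std_completed/std_sent:.1%}")
print(f"completed/clicked: {emb_completed/emb_clicked:.2%} vs {std_completed/std_clicked:.2%}")
print("p (completed/clicked):",
      round(two_proportion_z(emb_completed, emb_clicked, std_completed, std_clicked), 3))
```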

10 Results – No effect on data quality
Do we get different responses when we ask a question embedded in an email vs. in the survey itself? No. The ratio of the two NPS scores for the embedded and standard invitations was 0.98, suggesting that responses to the first question were nearly identical across the two conditions.
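The first question appears to be a Net Promoter Score item, since the slide compares NPS scores across conditions. Below is a minimal sketch of how that comparison could be computed, assuming the standard NPS definition (share of promoters rating 9–10 minus share of detractors rating 0–6); the rating lists are made-up placeholders, not the study's data.

```python
def nps(ratings):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(r >= 9 for r in ratings)
    detractors = sum(r <= 6 for r in ratings)
    return 100 * (promoters - detractors) / len(ratings)

# Hypothetical ratings for illustration only
embedded_ratings = [10, 9, 8, 7, 10, 6, 9, 9]
standard_ratings = [9, 10, 8, 6, 9, 7, 10, 9]

ratio = nps(embedded_ratings) / nps(standard_ratings)
print(round(nps(embedded_ratings), 1), round(nps(standard_ratings), 1), round(ratio, 2))
```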

11 Summary of Results Using an emailed survey invitation with the first question embedded: improves the click rate; improves the survey completion rate; has only a small negative effect on the survey drop-out rate; and has no effect on data quality, in terms of responses to the first (embedded) question.

12 Discussion Overall, this was a successful test of a new product feature.
Additional advantage: even if respondents drop out of the survey, their answers to the first question are recorded in the embedded condition; in the standard condition, if respondents drop out before completing the first page, all of their data is lost. Future research opportunities: embedding more than one question (or the whole survey?) in the email itself; experimenting with different survey lengths and question types; and testing whether this works with a different population of respondents.

13 Thank you Contact us at:

