1 Survey research techniques: Matching the method with the research question
Robert M. Stein
Department of Political Science, Rice University
March 2015
2 Popular survey techniques
- Face-to-face interviews
- Mail surveys
- Self-administered surveys
- Telephone interviews
  - Live interviewer
  - Interactive voice response (IVR)
- Web-based surveys (with or without telephone recruitment), e.g., Mechanical Turk, Knowledge Networks, Google Consumer Surveys
- Focus groups
3 Major research issues with surveys
- Sampling
- Cost
- Reactivity
- Psychology of the survey response
- The 'don't know' response
- Response time
- Social desirability
- Question wording and placement
- Meaning of survey responses
4 Survey methods for studying who votes
- Web-based surveys
  - Strengths: cost, sample size, timely data collection, survey panels
  - Weaknesses: selection bias (e.g., the digital divide)
- Telephone and face-to-face surveys
  - Strengths: sample size, can be limited to registered voters
  - Weaknesses: social desirability, reactivity, recall problems, sample selection bias (e.g., unlisted phone numbers)
- Exit polls
  - Strengths: validated voters, limited number of questions
  - Weaknesses: reactivity, unable to study non-voters
- IVR polls
  - Strengths: large sample, less invasive, lower cost
  - Weaknesses: low response rate, limited number of questions, skewed sample
- 'Surveyless' surveys (e.g., annotated voter histories)
  - Strengths: unambiguous, valid, and reliable information
  - Weaknesses: limited sample (i.e., registered voters) and biased time series
5 Overreporting voting
Robert Bernstein, Anita Chadha and Robert Montjoy, "Overreporting voting: Why it happens and why it matters," Public Opinion Quarterly 65(2001): 22-44.
Allyson L. Holbrook and Jon A. Krosnick, "Measuring voter turnout by using the randomized response technique: Evidence calling into question the method's validity," Public Opinion Quarterly 74(2010).
Brian D. Silver, Paul R. Abramson and Barbara A. Anderson, "The presence of others and overreporting of voting in American national elections," Public Opinion Quarterly 50(1986).
Robert F. Belli, Sean E. Moore and John VanHoewyk, "An experimental comparison of question forms used to reduce vote overreporting," Electoral Studies 25(2006).
6 Social expectation and overreporting of voting
Treatment #1: In talking to people about elections, we often find that a lot of people were not able to vote because they weren't registered, they were sick, or they just didn't have time. How about you: did you vote in the elections this November?
Treatment #2: In talking to people about elections, we often find that a lot of people were not able to vote because they weren't registered, they were sick, or they just didn't have time. Which of the following statements best describes you: One, I did not vote (in the election this November); Two, I thought about voting this time but didn't; Three, I usually vote, but didn't this time; or Four, I am sure I voted?
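The design above is a question-wording experiment: respondents are randomized between the two treatments and reported turnout is compared across arms. A minimal sketch of that comparison, using hypothetical reporting rates (the values and the two-proportion z-test are illustrative assumptions, not figures from the studies cited):

```python
import math
import random

def simulate_wording_experiment(n=1000, p_report_t1=0.75, p_report_t2=0.65, seed=42):
    """Randomly assign respondents to one of the two question wordings and
    record whether each reports having voted.  The reporting rates here are
    hypothetical illustration values."""
    rng = random.Random(seed)
    reports = {"treatment_1": [], "treatment_2": []}
    for _ in range(n):
        arm = "treatment_1" if rng.random() < 0.5 else "treatment_2"
        p = p_report_t1 if arm == "treatment_1" else p_report_t2
        reports[arm].append(rng.random() < p)
    return reports

def two_proportion_z(x1, n1, x2, n2):
    """Two-proportion z-test for the difference in reported turnout rates."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

reports = simulate_wording_experiment()
x1, n1 = sum(reports["treatment_1"]), len(reports["treatment_1"])
x2, n2 = sum(reports["treatment_2"]), len(reports["treatment_2"])
z = two_proportion_z(x1, n1, x2, n2)
print(f"reported turnout: T1={x1/n1:.2f}, T2={x2/n2:.2f}, z={z:.2f}")
```

A lower reporting rate under the face-saving wording of Treatment #2 would be consistent with social-desirability-driven overreporting under Treatment #1.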
10 Response time (latency) measures of contextual effects on voting behavior
Problem: Studies of contextual processes have always faced the possibility that if individuals' aggregation into geographic units is not exogenous to their values on the dependent variable, then what appear to be "contextual processes" may be due solely to selection effects.
Solution: Measure contextual variables separately from the individual-level variables, and later connect them to the survey data by means of linkage variables; here, latency measures of response time to survey questions.
Methodology: CATI (computer-assisted telephone interviewing) technology used to generate latency measures for respondents' assessments of the partisanship of their neighbors.
M. Johnson, W. Phillips Shively and R.M. Stein, "Contextual data and the study of elections and voting behavior: Connecting individuals to environments," Electoral Studies 21(2002).
11 Hypotheses
- When the perception of the partisanship of one's neighbors (i.e., "Generally speaking, do you usually think of your neighborhood as Republican, Democratic, or Independent?") is accessible (i.e., faster response time), context should have a significant and positive effect on vote choice.
- Republicans (Democrats) residing in neighborhoods they perceive to be Democratic (Republican) will be more likely to vote for Democrats (Republicans) when their perception of the partisan make-up of their neighborhood is readily accessible.
- The accessibility of one's context will be greater when that context is congruent with a respondent's personal preference.
12 Research design
- Live telephone interviews with 750 registered voters in Harris County, Texas, conducted September 23-29, 1999
- Retrospective vote choice in the 1996 presidential election, partisanship, and perception of the partisanship of the neighborhood
- Response times collected on a CATI system, similar to Bassili and Fletcher (1991)
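A common way to turn raw latencies into an accessibility measure is a median split: respondents who answer the neighborhood-partisanship question faster than the sample median are coded as holding an accessible perception. A minimal sketch of that coding step, with hypothetical response times (the median-split rule is one standard choice, not necessarily the exact operationalization used in the study):

```python
import statistics

def code_accessibility(latencies_ms):
    """Median-split coding: respondents answering faster than the sample
    median are coded True ('accessible' perception), slower ones False."""
    median = statistics.median(latencies_ms)
    return [lat < median for lat in latencies_ms]

# Hypothetical response times (milliseconds) to the
# neighborhood-partisanship question.
latencies = [850, 1200, 3100, 640, 2700, 910, 4200, 1500]
accessible = code_accessibility(latencies)
print(list(zip(latencies, accessible)))
```

In practice, latencies are usually trimmed for outliers and adjusted for each respondent's baseline answering speed before coding.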
16 Analyzing surveys
- When are answers to survey questions revealing of individual perceptions, preferences, and behavior?
- How might multiple survey responses reveal more information than single responses?
- An answer: risk perceptions and evacuations from hurricanes.
17 Google Consumer Surveys (GCS)
- GCS is a new tool developed by Google that surveys a sample of Internet users as they attempt to view content on the websites of online publishers who are part of the program.
- These sites require users to complete a survey created by researchers in order to access premium content provided by the publishers.
- GCS is currently available in four countries (USA, Canada, UK, and Australia), and it takes about 48 hours to field a survey of any size.
- The current cost structure of GCS makes very short surveys (even as short as one question) the most economically attractive.
- Since respondents are not part of an online panel, GCS has not collected demographic information at an earlier date (as is the case in many panels). Instead, GCS provides researchers with a set of respondents' "inferred demographics" (location, income, age, gender) derived from their browsing histories and IP addresses.
20 Limitations of GCS
- GCS recommends a question length of 125 characters and sets the maximum at 175 characters.
- Censored questions and populations: GCS restricts sensitive demographic information by prohibiting researchers from asking respondents for their age, gender, ethnicity, religion, and immigration status. Researchers can ask these questions only if the response choices include "I prefer not to say" as an opt-out answer.
- GCS also requires an option for respondents not to answer the question, via an "I don't know, show me another question" link.
- Observational studies that require statistical models with a long (or even short) list of control variables can likely be accomplished as efficiently using other survey tools.
21 Strengths of GCS
- First, if an experiment can be implemented in one question, the considerable cost advantages of GCS over its competitors can be realized.
- Second, if true randomization of treatments can be achieved (and we argue that GCS' methodology does so, and demonstrate the resulting balance in a set of measured covariates), then no additional control variables are necessary to make strong inferences.
- Finally, given that most experimenters are much more concerned with the internal validity of their inferences than with their external validity, unresolved questions about the representativeness of the GCS sample may be of secondary concern; the sample is certainly better than that of most lab experiments.
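The balance claim in the second point is typically checked by comparing measured covariates across the randomly assigned arms. A minimal sketch using the standardized mean difference on one inferred demographic; the data and the 0.1 balance threshold are hypothetical illustration values, not figures from GCS:

```python
import math
import statistics

def standardized_difference(group_a, group_b):
    """Standardized mean difference between two treatment arms for one
    covariate; values near zero indicate balance across arms."""
    mean_a, mean_b = statistics.mean(group_a), statistics.mean(group_b)
    pooled_sd = math.sqrt(
        (statistics.variance(group_a) + statistics.variance(group_b)) / 2
    )
    return (mean_a - mean_b) / pooled_sd

# Hypothetical inferred ages for respondents in two randomly assigned arms.
arm_a = [34, 45, 29, 52, 41, 38]
arm_b = [36, 44, 31, 49, 42, 40]
smd = standardized_difference(arm_a, arm_b)
print(f"standardized difference: {smd:.3f}")
```

A common rule of thumb treats absolute standardized differences below about 0.1 as evidence of adequate balance; the check would be repeated for each measured covariate.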