
1 Open-Ended Survey Questions: Non-Response Nightmare or Qualitative Data Dream?
Angie L. Miller, Ph.D.
Amber D. Lambert, Ph.D.
Center for Postsecondary Research, Indiana University
American Educational Research Association, April 2013

2 Literature Review
There is an increasing trend toward requiring colleges and universities to show measures of their effectiveness (Kuh & Ewell, 2010)
- Driven by a combination of the struggling economy, funding cuts to higher education, and the evolution of the traditional higher education model (e.g., distance education, MOOCs)
Alumni surveys are an important assessment tool but often have lower response rates (Smith & Bers, 1987)
- Due to bad contact information, suspicion of money solicitation, and decreased loyalty after graduation

3 Literature Review
Despite lower response rates, qualitative data from open-ended questions can still provide rich information from relatively few respondents (Geer, 1991; Krosnick, 1999)
Disadvantages of open-ended questions:
- Heavy burden on respondents (Dillman, 2007)
- Some personal characteristics, such as language fluency and positive affect, can impact the likelihood of responding to open-ended questions (Wallis, 2012)

4 Research Questions
Do open-ended responses represent the opinions of the entire sample?
Are some types of respondents more likely to complete these questions?
Does question placement on the survey impact responses?
The purpose of this study is to explore whether those with certain demographic and personal characteristics, including gender, age, cohort, number of children, marital status, citizenship, race, current employment status, income, and institutional satisfaction level, are more or less likely to respond to open-ended questions placed at the beginning, middle, and end of an online alumni survey.

5 Method: Participants
Data from the 2011 administration of the Strategic National Arts Alumni Project (SNAAP)
Participants were 33,801 alumni from 57 different arts high schools, arts colleges, or arts programs within larger universities
Sample consisted of 8% high school level, 70% undergraduate level, and 22% graduate level alumni
38% male, 62% female, 0.2% transgender
Majority (87%) reported their ethnicity as Caucasian
Average institutional response rate: 21%
Analyses included only those who completed the entire survey (i.e., did not drop out before the end): N = 27,212

6 What is SNAAP?
Online annual survey designed to assess and improve various aspects of arts-school education
Investigates the educational experiences and career paths of arts graduates nationally
Questionnaire topics include:
- Formal education and degrees
- Institutional experience and satisfaction
- Postgraduate resources for artists
- Career
- Arts engagement
- Income and debt
- Demographics

7 Method: Open-Ended Measures
From the beginning of the survey (Question 17 of 82): "Is there anything that [your institution] could have done better to prepare you for further education or for your career? Please describe."
From the middle (Question 44 of 82): "Please describe how your arts training is or is not relevant to your current work."
From the end (Question 80 of 82): "If there are additional things you would like to tell us about your education, life, and/or career that were not adequately covered on the survey, please do so here."
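Each of these items was later analyzed as a binary answered/did-not-answer variable (see slide 9). A minimal sketch of that recoding step, using hypothetical column names rather than the actual SNAAP variable names:

```python
import pandas as pd

# Hypothetical text-column names for the three open-ended items (Q17, Q44, Q80).
OPEN_ENDED_ITEMS = {
    "q17_prepare_better": "answered_beginning",
    "q44_training_relevance": "answered_middle",
    "q80_additional_comments": "answered_end",
}

def add_response_indicators(df: pd.DataFrame) -> pd.DataFrame:
    """Flag whether each respondent wrote anything in each open-ended box."""
    out = df.copy()
    for text_col, flag_col in OPEN_ENDED_ITEMS.items():
        # Treat missing values and whitespace-only strings as non-response.
        out[flag_col] = out[text_col].fillna("").str.strip().ne("").astype(int)
    return out
```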

8 Method: Demographic Measures
Demographic information collected for:
- Gender (categorical, 3 options)
- Age group (ordinal ranges)
- Graduation cohort (ordinal ranges)
- Number of children (ordinal ranges)
- Marital status (categorical, 4 options)
- Citizenship (binary)
- Race/ethnicity, "check all that apply" (binary, 7 indicators in total)
- Current employment status (categorical, 7 options)
- Income (ordinal, midpoints of ranges)
- Institutional satisfaction level (ordinal, 4-point Likert scale)
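Since income is analyzed as the ordinal midpoints of the reported ranges, a sketch of that recoding may help. The bracket labels and dollar values below are illustrative assumptions, not the actual SNAAP response options:

```python
import pandas as pd

# Hypothetical income brackets and midpoints (illustrative only).
INCOME_MIDPOINTS = {
    "Less than $10,000": 5_000,
    "$10,000-$19,999": 15_000,
    "$20,000-$39,999": 30_000,
    "$40,000-$59,999": 50_000,
    "$60,000 or more": 75_000,
}

def recode_income(df: pd.DataFrame) -> pd.DataFrame:
    """Map ordinal income ranges to their midpoints for the income comparisons."""
    out = df.copy()
    out["income"] = out["income_range"].map(INCOME_MIDPOINTS)
    return out
```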

9 Analyses
A series of 14 chi-squared analyses was conducted for each of the 3 open-ended question binary variables
- For gender, age group, graduation cohort, number of children, marital status, citizenship, each race/ethnicity option, and current employment status
3 independent-samples t-tests compared institutional satisfaction level across each of the 3 open-ended variables
3 non-parametric Mann-Whitney U tests were used for the income comparisons
- Skewed variance violated t-test assumptions
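As a rough illustration of the three families of tests (not the authors' actual code), the analyses for one response indicator could be run with scipy, assuming a pandas DataFrame with the hypothetical columns from the earlier sketches plus `gender`, `satisfaction`, and `income` fields:

```python
import pandas as pd
from scipy import stats

def run_tests(df: pd.DataFrame, flag_col: str = "answered_beginning") -> dict:
    """Illustrative versions of the chi-squared, t-test, and Mann-Whitney analyses."""
    answered = df[df[flag_col] == 1]
    skipped = df[df[flag_col] == 0]

    # Chi-squared test of independence for a categorical predictor (e.g., gender).
    contingency = pd.crosstab(df["gender"], df[flag_col])
    chi2, p_chi, dof, _ = stats.chi2_contingency(contingency)

    # Independent-samples t-test on institutional satisfaction
    # (Welch's version, since group sizes and variances differ).
    t, p_t = stats.ttest_ind(answered["satisfaction"], skipped["satisfaction"],
                             equal_var=False)

    # Non-parametric Mann-Whitney U test on income, which is skewed.
    u, p_u = stats.mannwhitneyu(answered["income"], skipped["income"],
                                alternative="two-sided")

    return {"chi2": (chi2, p_chi, dof), "t": (t, p_t), "U": (u, p_u)}
```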

10 Results: Descriptive Statistics
Much higher percentages of responses for the beginning and middle questions than for the end question

11 Results: Chi-Squared Analyses

12 Females more likely to respond
Those over 50 years of age more likely to respond
Those graduating in or before 1990 more likely to respond
Singles less likely to respond
Those with no dependent children more likely to respond
Those who are unemployed, retired, or with "other" employment status more likely to respond
U.S. citizens more likely to respond
Race: Asians less likely to respond, but those with "other" race/ethnicity more likely to respond

13 Results: Means and Other Ordinal Comparisons
Those answering the beginning and end questions were significantly less satisfied with their overall institutional experience

Item                | Didn't Answer: Mean | Answered: Mean | t value  | df       | Effect Size (d)
Near-Beginning Item | 3.57                | 3.40           | 20.33*** | 19914.16 | .26
Middle Item         | 3.44                | 3.45           | -.907    | 27082    | .01
Near-End Item       | 3.46                | 3.42           | 4.20***  | 9767.14  | .06

*p<.05; **p<.01; ***p<.001
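The effect sizes in the table are Cohen's d. As an illustration only (the exact variant used by the authors is an assumption), a pooled-standard-deviation version can be computed like this:

```python
import numpy as np

def cohens_d(x: np.ndarray, y: np.ndarray) -> float:
    """Cohen's d using a pooled standard deviation (one common variant)."""
    nx, ny = len(x), len(y)
    pooled_sd = np.sqrt(
        ((nx - 1) * x.std(ddof=1) ** 2 + (ny - 1) * y.std(ddof=1) ** 2)
        / (nx + ny - 2)
    )
    return (x.mean() - y.mean()) / pooled_sd
```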

14 Results: Means and Other Ordinal Comparisons
Those answering the open-ended questions had a significantly lower income than those who did not, across all questions

Item                | Didn't Answer: Mean Rank | Answered: Mean Rank | Mann-Whitney U value
Near-Beginning Item | 13860.60                 | 13173.39            | 74282028.0***
Middle Item         | 12056.14                 | 11307.10            | 54347479.5***
Near-End Item       | 12099.86                 | 11409.52            | 40097544.5***

*p<.05; **p<.01; ***p<.001
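The mean ranks reported above come from ranking income across the full sample and averaging within each responder group. A brief sketch, again using the hypothetical column names from the earlier examples:

```python
import pandas as pd

def mean_ranks(df: pd.DataFrame, value_col: str = "income",
               flag_col: str = "answered_beginning") -> pd.Series:
    """Average rank of the value column within each responder group."""
    ranks = df[value_col].rank()          # ranks computed over the full sample
    return ranks.groupby(df[flag_col]).mean()
```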

15 Discussion
Many patterns of results are consistent with previous literature
- English language fluency is a factor in responding to questions that require language production (Wallis, 2012): U.S. citizens were more likely to respond
- Open-ended responses require greater time and mental effort: those with no dependents, the unemployed, the retired, and those from older graduation cohorts were more likely to respond

16 Discussion
Those with negative feelings may be more likely to voice their opinions in the open-ended items
- A negativity bias has been found in research on workplace environments (Poncheri et al., 2008)
- This would explain why the unemployed were more likely to respond
- It would also explain why those who provided responses had significantly lower income and were less satisfied with their institutional experience
An analysis of the content of the comments might shed more light on this interpretation

17 Discussion
Also found an interesting pattern of an "other" response style
- For current employment status and race/ethnicity, those who select "other" are more likely to respond to open-ended questions
- A cursory review of the text boxes that accompany the "other" options shows that many responses actually do fall into a listed categorization
- Example: writing in "Caucasian/American Indian" even though both of these were represented in the race/ethnicity check-all
Are these people just more verbose? Or do they have a disposition that resists the confinement of categorization?

18 Conclusions
Limitations of the study: the sample may not be completely representative of all arts alumni (response rates and selective participation)
Assessing alumni can provide important information on institutional effectiveness, but researchers must be cognizant that some groups are more likely to provide open-ended responses
More research is needed on the influence of question placement for particular groups, as well as on the personal and environmental influences contributing to the "other" survey response style

19 References
Dillman, D. (2007). Mail and internet surveys: The tailored design method (2nd ed.). New York: Wiley.
Geer, J. G. (1991). Do open-ended questions measure "salient" issues? Public Opinion Quarterly, 55, 360-370.
Krosnick, J. (1999). Survey research. Annual Review of Psychology, 50, 537-567.
Kuh, G. D., & Ewell, P. T. (2010). The state of learning outcomes assessment in the United States. Higher Education Management and Policy, 22(1), 1-20.
Poncheri, R. M., Lindberg, J. T., Thompson, L. F., & Surface, E. A. (2008). A comment on employee surveys: Negativity bias in open-ended responses. Organizational Research Methods, 11(3), 614-630.
Smith, K., & Bers, T. (1987). Improving alumni survey response rates: An experiment and cost-benefit analysis. Research in Higher Education, 27(3), 218-225.
Wallis, P. (2012, April). Profiling college students who skip open-ended items in questionnaires with varied item formats. Paper presented at the Annual Meeting of the American Educational Research Association, Vancouver, British Columbia.

20 Questions or Comments?
Contact Information:
Angie L. Miller: anglmill@indiana.edu
Amber D. Lambert: adlamber@indiana.edu
Strategic National Arts Alumni Project (SNAAP)
www.snaap.indiana.edu
(812) 856-5824
snaap@indiana.edu

