Open-Ended Survey Questions: Non-Response Nightmare or Qualitative Data Dream?
American Educational Research Association, April 2013
Angie L. Miller, Ph.D., Amber D. Lambert, Ph.D.
Center for Postsecondary Research, Indiana University

Literature Review
- There is an increasing trend toward requiring colleges and universities to show measures of their effectiveness (Kuh & Ewell, 2010)
  - Driven by a combination of the struggling economy, funding cuts to higher education, and the evolution of the traditional higher education model (e.g., distance education, MOOCs)
- Alumni surveys are an important tool for assessment, but often have lower response rates (Smith & Bers, 1987)
  - Due to bad contact information, suspicion of money solicitation, and decreased loyalty after graduation

Literature Review
- Despite lower response rates, qualitative data from open-ended questions can still provide rich information from relatively few respondents (Geer, 1991; Krosnick, 1999)
- Disadvantages of open-ended questions:
  - Heavy burden on respondents (Dillman, 2007)
  - Some personal characteristics, such as language fluency and positive affect, can impact the likelihood of responding to open-ended questions (Wallis, 2012)

Research Questions
- Do open-ended responses represent the opinions of the entire sample?
- Are some types of respondents more likely to complete these questions?
- Does question placement on the survey impact responses?
The purpose of this study is to explore whether respondents with certain demographic and personal characteristics (gender, age, cohort, number of children, marital status, citizenship, race, current employment status, income, and institutional satisfaction level) are more or less likely to respond to open-ended questions placed at the beginning, middle, and end of an online alumni survey.

Method: Participants
- Data from the 2011 administration of the Strategic National Arts Alumni Project (SNAAP)
- Participants were 33,801 alumni from 57 different arts high schools, arts colleges, or arts programs within larger universities
- Sample consisted of 8% high school level, 70% undergraduate level, and 22% graduate level alumni
- 38% male, 62% female, 0.2% transgender
- Majority (87%) reported ethnicity as Caucasian
- Average institutional response rate: 21%
- Only alumni who completed the entire survey (i.e., did not drop out before the end) were included: N = 27,212

What is SNAAP?
- Online annual survey designed to assess and improve various aspects of arts-school education
- Investigates the educational experiences and career paths of arts graduates nationally
- Questionnaire topics include:
  - Formal education and degrees
  - Institutional experience and satisfaction
  - Postgraduate resources for artists
  - Career
  - Arts engagement
  - Income and debt
  - Demographics

Method: Open-Ended Measures
- From near the beginning of the survey (Question 17 of 82): "Is there anything that [your institution] could have done better to prepare you for further education or for your career? Please describe."
- From the middle (Question 44 of 82): "Please describe how your arts training is or is not relevant to your current work."
- From near the end (Question 80 of 82): "If there are additional things you would like to tell us about your education, life, and/or career that were not adequately covered on the survey, please do so here."
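A minimal sketch of how each item could be recoded into the binary answered/skipped indicator used in the analyses, assuming a pandas DataFrame and hypothetical column names (q17_text, q44_text, q80_text); the actual SNAAP variable names are not given in the presentation.

```python
import pandas as pd

# Hypothetical column names for the three open-ended items; the real
# SNAAP variable names are not reported in the slides.
OPEN_ENDED = {
    "q17_text": "answered_beginning",  # Question 17 of 82
    "q44_text": "answered_middle",     # Question 44 of 82
    "q80_text": "answered_end",        # Question 80 of 82
}

def add_response_indicators(df: pd.DataFrame) -> pd.DataFrame:
    """Add a 0/1 flag for each open-ended item: 1 = respondent wrote something."""
    out = df.copy()
    for text_col, flag_col in OPEN_ENDED.items():
        # Treat missing values and whitespace-only strings as non-response.
        out[flag_col] = out[text_col].fillna("").str.strip().ne("").astype(int)
    return out
```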

Method: Demographic Measures
Demographic information was collected for:
- Gender (categorical, 3 levels)
- Age group (ordinal ranges)
- Graduation cohort (ordinal ranges)
- Number of children (ordinal ranges)
- Marital status (categorical, 4 levels)
- Citizenship (binary)
- Race/ethnicity, "check all that apply" (binary, 7 options)
- Current employment status (categorical, 7 levels)
- Income (ordinal, midpoints of ranges)
- Institutional satisfaction level (ordinal, 4-point Likert scale)

Analyses
- A series of 14 chi-square analyses was conducted for each of the 3 binary open-ended response variables
  - For gender, age group, graduation cohort, number of children, marital status, citizenship, each race/ethnicity option, and current employment status
- 3 independent-samples t-tests compared institutional satisfaction level across each of the 3 binary open-ended response variables
- 3 non-parametric Mann-Whitney U tests were completed for the income comparisons
  - Skewed variance violated t-test assumptions
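A minimal sketch of the three test families using SciPy, assuming the indicator columns defined in the earlier sketch and hypothetical demographic column names (gender, satisfaction, income); this illustrates the analytic approach, not the authors' actual code or software.

```python
import pandas as pd
from scipy import stats

def run_tests(df: pd.DataFrame, answered: str = "answered_beginning") -> dict:
    """Chi-square, t-test, and Mann-Whitney U for one open-ended indicator."""
    responded = df[df[answered] == 1]
    skipped = df[df[answered] == 0]

    # Chi-square test of independence: categorical predictor vs. answered (0/1).
    table = pd.crosstab(df["gender"], df[answered])
    chi2, p_chi, dof, _ = stats.chi2_contingency(table)

    # Independent-samples t-test on institutional satisfaction (4-point scale).
    t, p_t = stats.ttest_ind(
        responded["satisfaction"], skipped["satisfaction"], nan_policy="omit"
    )

    # Mann-Whitney U on income (skewed, so ranks rather than means).
    u, p_u = stats.mannwhitneyu(
        responded["income"].dropna(), skipped["income"].dropna(),
        alternative="two-sided",
    )
    return {"chi2": (chi2, p_chi, dof), "t": (t, p_t), "U": (u, p_u)}
```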

Results: Descriptive Statistics
Much higher percentages of responses for the beginning and middle questions than for the end question.
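A minimal sketch of how those response percentages could be tallied, assuming the indicator columns from the earlier sketch; the actual percentages shown on the slide are not reproduced in this transcript.

```python
import pandas as pd

def response_percentages(df: pd.DataFrame) -> pd.Series:
    """Percent of completed surveys with a non-empty response to each item."""
    flags = ["answered_beginning", "answered_middle", "answered_end"]
    # Each flag is 0/1, so the column mean is the proportion who answered.
    return df[flags].mean() * 100
```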

Results: Chi-Squared Analyses

- Females more likely to respond
- Those over 50 years of age more likely to respond
- Those graduating in or before 1990 more likely to respond
- Singles less likely to respond
- Those with no dependent children more likely to respond
- Unemployed, retired, or those with "other" employment status more likely to respond
- U.S. citizens more likely to respond
- Race/ethnicity: Asians less likely to respond, but those selecting "other" race/ethnicity more likely to respond

Results: Means and Other Ordinal Comparisons
Those answering the beginning and end questions were significantly less satisfied with their overall institutional experience.
[Table on slide: mean satisfaction for those who did not vs. did answer each item, with t value, df, and effect size (d); numeric values not preserved in this transcript. Near-beginning item: ***; middle item: not significant; near-end item: ***]
*p < .05; **p < .01; ***p < .001
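The effect sizes in the table are reported as d; a minimal sketch of that calculation is shown below, assuming the pooled-standard-deviation form of Cohen's d (the slide does not specify which variant was used).

```python
import numpy as np

def cohens_d(group_a: np.ndarray, group_b: np.ndarray) -> float:
    """Cohen's d with a pooled standard deviation (equal-variance form)."""
    na, nb = len(group_a), len(group_b)
    pooled_sd = np.sqrt(
        ((na - 1) * group_a.var(ddof=1) + (nb - 1) * group_b.var(ddof=1))
        / (na + nb - 2)
    )
    return (group_a.mean() - group_b.mean()) / pooled_sd
```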

Results: Means and Other Ordinal Comparisons
Those answering the open-ended questions had significantly lower income than those who did not, across all three questions.
[Table on slide: mean income rank for those who did not vs. did answer each item, with the Mann-Whitney U value; numeric values not preserved in this transcript. Near-beginning, middle, and near-end items: all ***]
*p < .05; **p < .01; ***p < .001
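For the income comparisons the table reports mean ranks alongside the Mann-Whitney U statistic; a minimal sketch of how mean ranks can be computed with SciPy is shown below, using hypothetical arrays rather than the SNAAP data.

```python
import numpy as np
from scipy.stats import rankdata

def mean_ranks(income_answered: np.ndarray, income_skipped: np.ndarray) -> tuple:
    """Mean ranks of income for responders vs. non-responders."""
    combined = np.concatenate([income_answered, income_skipped])
    ranks = rankdata(combined)  # ties receive their average rank
    n_a = len(income_answered)
    return ranks[:n_a].mean(), ranks[n_a:].mean()
```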

Discussion
- Many patterns of results are consistent with previous literature
- English language fluency is a factor in responding to questions that require language production (Wallis, 2012)
  - U.S. citizens more likely to respond
- Open-ended responses require greater time and mental effort
  - Those with no dependents, the unemployed, the retired, and those from older graduation cohorts more likely to respond

Discussion
- Those with negative feelings may be more likely to voice their opinions in the open-ended items
  - A negativity bias has been found in research on workplace environments (Poncheri et al., 2008)
  - Explains why the unemployed were more likely to respond
  - Also explains why those who provided responses had significantly lower income and were less satisfied with their institutional experience
- An analysis of the content of the comments might shed more light on this interpretation

Discussion
- Also found an interesting pattern suggesting an "other" response style
  - For current employment status and race/ethnicity, those who select "other" are more likely to respond to open-ended questions
  - A cursory review of the text boxes accompanying the "other" options shows that many responses actually do fall into a listed category
  - Example: writing in "Caucasian/American Indian" even though both of these were represented in the race/ethnicity check-all list
- Are these people just more verbose? Or do they have a disposition that resists the confinement of categorization?

Conclusions
- Limitations of the study: the sample may not be completely representative of all arts alumni (response rates and selective participation)
- Assessing alumni can provide important information on institutional effectiveness, but institutions must be cognizant that some groups are more likely to provide open-ended responses
- More research is needed on the influence of question placement for particular groups, as well as on the personal and environmental influences contributing to the "other" survey response style

References
Dillman, D. (2007). Mail and internet surveys: The tailored design method (2nd ed.). New York: Wiley.
Geer, J. G. (1991). Do open-ended questions measure "salient" issues? Public Opinion Quarterly, 55.
Krosnick, J. (1999). Survey research. Annual Review of Psychology, 50.
Kuh, G. D., & Ewell, P. T. (2010). The state of learning outcomes assessment in the United States. Higher Education Management and Policy, 22(1).
Poncheri, R. M., Lindberg, J. T., Thompson, L. F., & Surface, E. A. (2008). A comment on employee surveys: Negativity bias in open-ended responses. Organizational Research Methods, 11(3).
Smith, K., & Bers, T. (1987). Improving alumni survey response rates: An experiment and cost-benefit analysis. Research in Higher Education, 27(3).
Wallis, P. (2012, April). Profiling college students who skip open-ended items in questionnaires with varied item formats. Paper presented at the Annual Meeting of the American Educational Research Association, Vancouver, British Columbia.

Questions or Comments?
Contact Information:
Angie L. Miller
Amber D. Lambert
Strategic National Arts Alumni Project (SNAAP)
(812)