
Association for Institutional Research Annual Forum May 2013 Angie L. Miller, Ph.D. Amber D. Lambert, Ph.D. Center for Postsecondary Research, Indiana University


1 Association for Institutional Research Annual Forum May 2013 Angie L. Miller, Ph.D. Amber D. Lambert, Ph.D. Center for Postsecondary Research, Indiana University Innovations in Collecting and Reporting Complex Survey Data

2 Background & Purpose
There is an increasing trend toward requiring colleges and universities to show measures of their effectiveness (Kuh & Ewell, 2010), and surveys are a common way to accomplish this.
Online data collection allows researchers to incorporate several programming-based components into surveys (Dillman, 2007).

3 Background & Purpose
Complex survey features include:
- Skip logic: respondents receive follow-up questions based on answers to filter questions
- Populating response options: the options available are based on answers to earlier questions
- Filling in question stems
- Java-enabled elements to prevent inconsistent responses
These features are intended to ease the process of taking the survey from the respondent's perspective, but they can complicate data management and reporting for the researcher.

4 Examples from the Strategic National Arts Alumni Project (SNAAP)

5 SNAAP
As an example, we will discuss some of the ways that the Strategic National Arts Alumni Project (SNAAP) handles and reports complex data.
What is SNAAP?
- An online annual survey designed to assess and improve various aspects of arts-school education
- Investigates the educational experiences and career paths of arts graduates nationally
- Findings are provided to educators, policymakers, and philanthropic organizations to improve arts training, inform cultural policy, and support artists

6 Who does SNAAP survey?
Participants drawn from:
- Arts high schools
- Independent arts colleges
- Arts schools, departments, or programs in comprehensive colleges/universities
Over 5 years, SNAAP has been administered at nearly 300 institutions of various focuses, sizes, and other institutional characteristics.
Cohort year sampling:
- 2008 and 2009 field tests: 5, 10, 15, & 20 years out
- 2010 field test: 1-5, 10, 15, & 20 years out
- 2011 and forward: all years, to generate the most comprehensive data possible

7 Increasing Numbers…
- 2010 Field Test: over 13,000 respondents, 154 institutions
- 2011 Administration: more than 36,000 respondents, 66 institutions
- 2012 Administration: more than 33,000 respondents, 70 institutions
Now able to combine 2011 and 2012 respondents to create a "SNAAP Database" with over 68,000 respondents.

8 Questionnaire Topics
- Formal education and degrees
- Institutional experience and satisfaction
- Postgraduate resources for artists
- Career
- Arts engagement
- Income and debt
- Demographics

9 Data Management Issues: Skip Logic
Treating skip logic as a valid data point: if a respondent does not receive a question due to their answer on a previous question, leaving them as "missing" on the follow-up question can have erroneous implications for the data.

10 Data Management Issues: Skip Logic
SNAAP example: if a respondent answers "No" to the filter item, then they do NOT receive the follow-up item. (Survey screenshots shown on slide.)

11 Data Management Issues: Skip Logic
SNAAP example: asking about length of time spent working as an artist would not make sense for someone who has not worked as an artist. But leaving the data point as "missing" does not differentiate the non-artists from:
- Those who saw the question and chose not to answer it (non-response), and
- Those who did not complete the survey (break-off)

12 Data Management Issues: Skip Logic
SNAAP example: to address this issue, we developed a system for coding those who did not receive questions due to skip logic with valid response values.
- Assigned them negative values (-1, -2, etc.)
- Multiple negative values if there are multiple reasons why respondents might not receive the question
- Non-response and break-off cases are left as missing data points
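As a sketch of this negative-value coding, the recoding for a single follow-up question might look like the following. The variable names, response labels, and the specific code -1 are illustrative, not SNAAP's actual scheme:

```python
# Hypothetical sketch of coding skip logic as a valid data point.
SKIP_NOT_ARTIST = -1    # did not receive the question: never worked as an artist
# Non-response and break-off stay as None (missing).

def code_years_as_artist(record):
    """Return a valid data point for the follow-up question,
    distinguishing legitimate skips from true missing data."""
    worked_as_artist = record.get("ever_worked_as_artist")  # "Yes"/"No"/None
    answer = record.get("years_as_artist")                  # raw follow-up answer
    if worked_as_artist == "No":
        return SKIP_NOT_ARTIST   # skipped by design: valid negative code
    return answer                # "Yes" respondents keep their answer;
                                 # None stays None (non-response/break-off)

respondents = [
    {"ever_worked_as_artist": "Yes", "years_as_artist": "1-5 years"},
    {"ever_worked_as_artist": "No",  "years_as_artist": None},
    {"ever_worked_as_artist": None,  "years_as_artist": None},  # break-off
]
coded = [code_years_as_artist(r) for r in respondents]
# coded == ["1-5 years", -1, None]
```

The key point is that the skipped respondent becomes a countable, valid value rather than blending into the missing data.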

13 Data Management Issues: Skip Logic
SNAAP example: what about those who dropped out of the survey, but would not have received the follow-up questions based on their earlier answers?
- Incorporated into the coding system a rule assigning missing values to all questions past each respondent's break-off point, regardless of answers to filter questions
- We do not want fluctuations in the number of valid responses as one progresses through the survey instrument (the count should only decrease as people drop out)
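A minimal sketch of this break-off rule, assuming a hypothetical four-question instrument where negative integers are skip-logic codes and None is missing:

```python
# Illustrative sketch of the break-off rule: once a respondent's last real
# answer is found, every later question stays missing, even if skip logic
# would otherwise have assigned it a valid negative code.
QUESTION_ORDER = ["q1", "q2", "q3", "q4"]   # hypothetical instrument order

def is_real_response(value):
    # None is missing; negative ints are skip-logic codes; neither counts
    # as the respondent actually answering a question they saw.
    return value is not None and not (isinstance(value, int) and value < 0)

def apply_breakoff_rule(record):
    answered = [i for i, q in enumerate(QUESTION_ORDER)
                if is_real_response(record.get(q))]
    if answered:
        for q in QUESTION_ORDER[answered[-1] + 1:]:
            record[q] = None   # force missing past the break-off point
    return record

# Respondent answered q1 and q2, then broke off; q4 had been coded -1
# from skip logic, but the rule overrides it with missing:
record = {"q1": "No", "q2": "Agree", "q3": None, "q4": -1}
print(apply_breakoff_rule(record))
# {'q1': 'No', 'q2': 'Agree', 'q3': None, 'q4': None}
```

This keeps the count of valid responses monotonically decreasing through the instrument, as the slide describes.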

14 Data Management Issues: Inconsistent Responses
A respondent might answer a filter question one way, receive (and answer) the follow-up question, but then back up in the survey and change the filter question (so they should NOT have received the follow-up).
SNAAP example: a respondent says they are currently a professional artist and answers "less than 1 year" for length, then goes back and changes their response to say that they have never been a professional artist. Which response is the right one?

15 Data Management Issues: Inconsistent Responses
SNAAP example: we imposed a rule that treats the most "recent" response as accurate.
- This necessitates altering original responses on follow-up questions that are no longer relevant
- The "less than one year" response would be recoded to a negative value (did not receive due to skip logic)
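The "most recent answer wins" rule could be sketched as follows (the question wording, response labels, and the -1 code are illustrative):

```python
# Hypothetical reconciliation of an inconsistent filter/follow-up pair:
# the final state of the filter question determines the follow-up's value.
SKIP_NOT_ARTIST = -1   # illustrative skip-logic code

def reconcile(filter_answer, followup_answer):
    """Keep the follow-up answer only if the (most recent) filter answer
    still routes the respondent to the follow-up question."""
    if filter_answer == "Never":
        return SKIP_NOT_ARTIST   # follow-up no longer applies: recode as skip
    return followup_answer

# Respondent first said they were a professional artist and answered
# "Less than 1 year", then backed up and changed the filter to "Never":
print(reconcile("Never", "Less than 1 year"))   # -1
```

The original "Less than 1 year" answer is deliberately overwritten, matching the slide's point that earlier follow-up responses are altered once they are no longer relevant.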

16 Reporting Issues: Skip Logic
Because valid response values are assigned to follow-up questions on which some respondents were skipped, when reporting frequencies for each question it is imperative to visually notate (Sanders & Filkins, 2009) which values are assigned due to skip logic and which values are responses on the survey itself.
Keeping skip logic values allows one to make statements about the entire sample, rather than requiring an "of those who…" statement throughout the report.
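One way to tabulate frequencies over the entire sample, with skip-logic codes given descriptive labels so they stand apart from survey responses, might look like this (the labels and codes are hypothetical):

```python
from collections import Counter

# Illustrative mapping from skip-logic codes to report labels.
SKIP_LABELS = {-1: "Never worked as a professional artist (skipped)"}

def frequency_table(values):
    """Percentages over all valid data points, including skip-logic codes,
    so statements can describe the entire sample rather than a subgroup."""
    valid = [v for v in values if v is not None]   # drop true missing only
    counts = Counter(valid)
    return {SKIP_LABELS.get(v, v): round(100 * n / len(valid), 1)
            for v, n in counts.items()}

data = ["Less than 1 year", -1, -1, "1-5 years", None]   # None = break-off
print(frequency_table(data))
```

A report built this way can say "50% never worked as a professional artist" about the whole sample, instead of prefacing every figure with "of those who…".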

17 Reporting Issues: Skip Logic SNAAP Example: Italicized skip logic labels in reports

18 Reporting Issues: Skip Logic
There is also the option of presenting frequencies only for those who received the question and leaving those skipped as missing.
- This is problematic if consumers of the report do not thoroughly read the accompanying statements (Suskie, 1996); they may easily misrepresent the survey results
- Reporting those skipped on the question prevents distortion of the data

19 Reporting Issues: Skip Logic
SNAAP example: this report is only provided as a PDF, to deter copying/pasting of results without the accompanying introduction language.

20 Reporting Issues: Skip Logic
Questions that populate from previous "check all" lists should also be treated like other questions using skip logic.
SNAAP example: respondents are asked to check all that apply from a list of jobs associated with the arts in which they have EVER worked (and the same for a list of jobs outside of the arts). Only the jobs they selected appear in a later question, this time asking in which of these they CURRENTLY work.
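A sketch of coding such a populated follow-up question this way, where jobs never selected on the EVER list get a skip-logic code rather than being left missing (the job list and codes are made up for illustration):

```python
# Hypothetical coding for a "current jobs" question whose options are
# populated from an earlier "ever worked" check-all question.
SKIP_NEVER_HELD = -1   # never selected on the EVER list, so never shown here

def code_current_jobs(ever_jobs, current_jobs, all_jobs):
    coded = {}
    for job in all_jobs:
        if job not in ever_jobs:
            coded[job] = SKIP_NEVER_HELD        # skipped by design
        else:
            coded[job] = 1 if job in current_jobs else 0   # checked / unchecked
    return coded

all_arts_jobs = ["Dancer", "Graphic designer", "Music teacher"]  # illustrative
print(code_current_jobs({"Graphic designer"}, {"Graphic designer"}, all_arts_jobs))
# {'Dancer': -1, 'Graphic designer': 1, 'Music teacher': -1}
```

Every respondent then has a valid value for every job in the master list, which is what lets the report present the populated question like any other item.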

21 Reporting Issues: Skip Logic SNAAP Example: Jobs associated with the arts EVER

22 Reporting Issues: Skip Logic SNAAP Example: CURRENT jobs

23 Reporting Issues: Skip Logic SNAAP Example: What appeared as a “check all” in the survey is presented like other “radio button” items

24 Importance of Codebook
Integrating skip logic into reporting highlights the need to maintain a detailed codebook for reference. Dynamic surveys should include several components in their codebook:
- Who receives each question
- Which response options are populated based on previous answers
- Which places have individual words/phrases filled into a question stem
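A minimal, hypothetical codebook entry capturing these components might look like the following (none of the names, wording, or codes are SNAAP's actual codebook):

```python
# Illustrative codebook entry for one dynamic survey question, recording
# who receives it, how its options are populated, and its skip-logic codes.
codebook = {
    "years_as_artist": {
        "text": "How long did you work as a professional artist?",
        "received_by": "Respondents answering 'Yes' to ever_worked_as_artist",
        "populated_options": None,   # options are fixed for this item
        "stem_fills": None,          # no words/phrases filled into the stem
        "skip_codes": {-1: "Never worked as a professional artist"},
    }
}
print(codebook["years_as_artist"]["skip_codes"][-1])
# Never worked as a professional artist
```

Keeping this metadata alongside the data is what makes the negative-value codes interpretable months or years later.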

25 Importance of Codebook SNAAP example:

26 Importance of Codebook SNAAP example:

27 References
Dillman, D. (2007). Mail and internet surveys: The tailored design method (2nd ed.). New York: Wiley.
Kuh, G. D., & Ewell, P. T. (2010). The state of learning outcomes assessment in the United States. Higher Education Management and Policy, 22(1).
Sanders, L., & Filkins, J. (2009). Effective reporting (2nd ed.). Tallahassee, FL: Association for Institutional Research.
Suskie, L. A. (1996). Questionnaire survey research: What works (2nd ed.). Tallahassee, FL: Association for Institutional Research.

28 Questions or Comments? Contact Information: Angie L. Miller Amber D. Lambert Strategic National Arts Alumni Project (SNAAP) (812)


