New Patterns in Response Rates: How the Online Format Has Changed the Game Presented by David Nelson, PhD Purdue University
National Switch to Online Format Recognized Benefits – Saves paper and printing costs – Reduces administrative time – Rapid results – Easy tracking of data – Protected anonymity for qualitative feedback
National Switch to Online Format Recognized Drawbacks – Possible technical issues – Changing ratings – Lower response rates National Research – October 2003 New Directions for Teaching & Learning – 2010 IDEA Center Study
Context at Purdue University – Your Mileage May Vary Demographics – ~35,000 students are eligible to participate – ~3,500 instructors in ~6,000 sections – ~750 distinct surveys each semester – significant department autonomy in evaluation administration Implementation – Select pilot test – Phased, 3-semester rollout – 75 academic units
Faculty Concerns The Ratings Will Change Only the Students with Strong Opinions Will Respond Why Does the Feedback Matter if Only Half the Students Provide It?
Decreasing Response Rate Pilot Tests – 69%; 72%; 70% First Semester – 63% Third Semester – 61% Spring 2011 – 55%
Why & What Does This Mean? Are the Numbers Different? Are Students Refusing to Do This? Why Is the Response Rate Decreasing? How Is the Format Change Affecting Responses? What Are Short and Long Term Solutions?
Are the Numbers Different? NO – Grouped Data Median Rating – Average (Mean) Numbers Same – No Changes in Ends of the Scale YES – Individual Instructors – Small and Upper Division Affected – Perception and Value
Are the Students Refusing to Do Evaluations? YES – Almost one-third of students never log in – Response rate is lower than on paper – Rate is progressively declining NO – 72% of students complete at least one survey – This number is constant, independent of the global response rate decrease So…
Why is Response Rate Decreasing and Where is the Discrepancy? The Reason: Partial Respondents
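The discrepancy above comes down to the unit of analysis: the same response log gives a healthy student-level participation rate and a declining survey-level response rate at the same time. A minimal sketch, with invented data, of how the two metrics diverge when partial respondents are present:

```python
# Hypothetical response log: each record is (student_id, survey_id, completed).
# The data here is invented purely to illustrate the two metrics.
records = [
    ("s1", "a", True),  ("s1", "b", True),   # full respondent
    ("s2", "a", True),  ("s2", "b", False),  # partial respondent
    ("s3", "a", False), ("s3", "b", False),  # never logs in
]

# Survey-level response rate: completed surveys / assigned surveys.
survey_rate = sum(r[2] for r in records) / len(records)

# Student-level participation: students who complete at least one survey.
students = {r[0] for r in records}
active = {r[0] for r in records if r[2]}
student_rate = len(active) / len(students)

print(f"survey-level rate:  {survey_rate:.0%}")   # 50%
print(f"student-level rate: {student_rate:.0%}")  # 67%
```

With one partial respondent in three students, the student-level rate (67%) sits well above the survey-level rate (50%) — the same pattern as 72% of students participating while the overall rate slides toward 55%.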
Travel Back to 1995 Evaluations Conducted via Paper No Method for Tracking Which Students Responded Respondents Are Those in Attendance – Those Presented with a Paper Evaluation Usually Completed It
Return to the 2010s Students Receive ALL Surveys at Once, via Email or Web Browser Students Must Make a Conscious Choice to Visit the Site and Complete Each Survey This Choice Is Repeated for Each Survey Change from Structured Reflection to a List of Tasks
Partial Respondents - Defined Not Referring to Those Who FAIL to Complete All Questions on an Individual Survey… – Required Questions Option – Anecdotal and Completely Non-Scientific Analysis …But Students Who Engage Partially in the Process
Why Are Partial Respondents So Important to Response Rates? Assume Some Students Will Never Engage, Short of Mandates or Heavy Incentives Response Rates Matter to Validity and Perception They Are the Easiest Targets
Why Are Partial Respondents Not Responding? – One Research Study Qualitative Survey of 2,800 Partial Respondents Quantitative Analysis of Response Rate Patterns Hypotheses – Survey Fatigue (Length and Number of Surveys) – Class Size and Class Level
Qualitative Research – Design Students Who Completed 40% of Assigned Evals, But Not All 667 Responded to Our Survey Request 2 Questions: – What Factor Caused You to NOT Complete All Evals? – What Factor Would Make You MOST LIKELY to Complete All Evals in the Future?
Qualitative Research – Results: Factors for Partial Completion 1. Survey Fatigue (Too Many Surveys) – 33% 2. Did Not Care About the Class – 14% 3. Instructor Unlikely to View and Incorporate Comments – 13% 4. Survey Was Too Long – 10% 5. Asked to Complete Same Evaluation Twice
Qualitative Research – Results: Factors for Future Completion 1. Incentives or Credit – 57% 2. Instructor Demonstrates He/She Cares About the Results – 17% 3. More Time – 10% 4. Fewer Evaluations – 7% 5. Shorter Evaluations
Quantitative Research – Results Longer Surveys Do NOT Correlate with Lower Response Rates (in fact, the relationship is INVERSE) A Greater Number of Surveys Does NOT Correlate with Response Rate Class Size and Class Level Have MIXED Correlations However, We DO Have Quantitative Evidence of Survey Fatigue
Documentation of Survey Fatigue Making the Conscious Choice to Complete the NEXT Survey Calculate Percentage Likelihood of Completion Based on Survey Position Random Sampling of 1,000 Students Probability Relative to Survey Position Greater for Partial Responders
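The position analysis above can be sketched in a few lines: for a sample of students, count completions at each list position and convert to a percentage likelihood. The data here is simulated under an assumed linear-fatigue model (the drop-off rate and baseline are invented, not the study's actual figures):

```python
from collections import defaultdict
import random

random.seed(42)

def simulate_student(n_surveys=5, base_p=0.8, fatigue=0.12):
    """Completion flags for one student's survey list; each later
    position is less likely to be completed (assumed linear fatigue)."""
    return [random.random() < max(base_p - fatigue * i, 0.0)
            for i in range(n_surveys)]

# Random sample of 1,000 students, as in the slide above.
N = 1000
completed = defaultdict(int)
for _ in range(N):
    for pos, done in enumerate(simulate_student(), start=1):
        completed[pos] += done

for pos in sorted(completed):
    print(f"position {pos}: {completed[pos] / N:.1%} completed")
```

The output shows the fatigue signature: likelihood of completion falls steadily with survey position, and in the real data that slope was steeper for partial responders.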
Survey Fatigue – Visualized [chart: Likelihood of Completion vs. Position of Survey]
The Problem and Short Term Solution Surveys at the Bottom of a List Receive Lower Response Rates Listing Surveys Alphabetically Penalizes Certain Classes or Departments Some Surveys Can Have Severely Lower Response Rates, Based Solely upon Position The Short Term Solution?
Chris Probst to the Rescue Challenge Posed Challenge Answered Build 72 Includes Survey Order Randomization Shares the Pain; Long-Term Solutions Still Needed
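One way survey order randomization can work is sketched below. The Build 72 implementation isn't described here, so this is an assumed approach: seeding a shuffle with the student ID spreads the bottom-of-list penalty evenly across courses while keeping each individual student's order stable across logins.

```python
import hashlib
import random

def survey_order(student_id, surveys):
    """Return the given surveys in a randomized order that is
    deterministic per student (hypothetical scheme, not Build 72's)."""
    # Hash the student ID into a reproducible integer seed.
    seed = int(hashlib.sha256(student_id.encode()).hexdigest(), 16)
    order = sorted(surveys)              # canonical starting order
    random.Random(seed).shuffle(order)   # deterministic per-student shuffle
    return order

courses = ["BIOL101", "CHEM115", "ENGL106", "MATH161"]
print(survey_order("student-001", courses))
print(survey_order("student-001", courses))  # same order on revisit
print(survey_order("student-002", courses))  # different student, different order
```

Because the penalty now falls on a different course for each student, no single class or department is systematically stuck at the bottom of an alphabetical list — the pain is shared, as the slide puts it.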
What May Be Happening 1.Fatigue Sets in Immediately 2.Fatigue Threshold Is Met After a Particular Survey 3.Length May Affect the Survey Following, NOT a Particularly Long Survey Itself
What We Can't (Haven't) Yet Track(ed) Students Saving Progress or Logging Off After Completing SOME Surveys, and If Length Is a Factor If a Long Survey Affects Likelihood of Completing the Following One Exactly Where the Breaking Point Occurs
What Seems Likely Survey Fatigue Is a Real Phenomenon for Online Evaluations Survey Fatigue as a Causal Factor for Non-Completion Is a New Phenomenon Multiple Incentives Are Best Class-Level Incentives Seem to Be Most Effective (for Voluntary Participation)