1 All by myself: Do students read summative feedback returned to them alone at a computer?
Phil Denton and David McIlroy, Faculty of Science

2 Feedback for learning
- The facilitation of learning is one of the reasons why assessment is undertaken.1 This relies on the return of high-quality feedback.
- There have been a number of qualitative studies into students' experiences of tutor feedback:
  - Task-focussed comments are valued.2
  - General and vague feedback is ineffectual.3
  - Feedback must show how to address deficiencies.4
- Take the above into account & return within 15 days…

3 e-Marking
- Electronic marking tools can expedite marking. They often have a 'statement bank', e.g. GradeMark.
- Staff have reported a positive impact from the increased use of such tools;5 however, their effectiveness merits further research.6
- Of particular interest to this study are the experiences of students receiving tutor feedback on assessments via computer. Do they read, let alone act upon, e-feedback?

4 Paper versus Electronic Assessment
(Diagram comparing the submission, marking and feedback stages.)

5 Students' response to e-feedback
- Our previous work suggested a large minority of 1st years did not read feedback on a summative report.7
- At the end of emailed feedback, students were invited to reply to confirm that they had read it.
- Group B: the % mark was hidden, and students were invited to guess it.

6 Quantitative studies of responses to feedback
- Our previous study offered no way to differentiate between students who did not read the request for a reply and those who read it and chose to ignore it.
- A solution is to place the invitation at different positions within the feedback and compare response rates:
"As part of an educational research project, could I please ask you to immediately reply to this email with a blank message. Please do not inform your fellow students that you have done this."

7 Outcomes of two quantitative studies
Study 1: First-year Pharmaceutical Science data analysis exercise. Feedback (~400 words) emailed.
- For 25 students, the request for a reply was at the start, adjacent to the mark; 56% of these students replied.
- 24 students had the same request in the final paragraph, and only 42% of these students replied.
- The % mark awarded appears to determine the response: the average mark was 76% for repliers and 57% for non-repliers.
- Not clear if 'word of mouth' affected replies.

8 Recipients of summative e-feedback
Study 2: Formative FHEQ3 Natural Sciences Excel task with brief indicative feedback, followed by ~700 words of e-feedback on the associated summative task.

Table 2 Recipients of summative e-feedback.
Group | N  | Request position       | Mean % | Median %
T1    | 27 | At start, next to mark | 63     | 65
T2    |    | End of 1st paragraph   | 64     | 68
T3    | 26 | End of 2nd paragraph   | 69     |
T4    |    | End of 3rd paragraph   |        |
T5    |    | End of 4th paragraph   |        |
C     |    | None (control group)   |        |

9 Analysis of response rates by Group
Figure 1 Natural Science students (N = 161) who replied to feedback and who did not, by group.

10 No students in the control group emailed a reply.
- No significant difference in reply rates between T2–T5 (average 36%): when feedback was read, it was read in full.
- The mean reply rate for students in T1 (M = 63, SD = 49, N = 27) was significantly higher than for T2–T5 (M = 36, SD = 48, N = 107) based on a two-sample t-test for equal variances, t(132) = 2.54, p = .012.
- The T1 rate indicates that 63% of students will reply once they have read a request. For T2–T5, only 36% replied. The difference between these two rates, 27%, represents students who did not read the request and, by extension, their feedback.

11 This comparison of response rates gives:
- Read: 36%
- Either(a): 37%
- Not read: 27%
Figure 2 Responses of Foundation Natural Science students (N = 161) to feedback on an Excel assignment. (a) Did not reply to, but may have read, their feedback.

12 Analysis of response rates by % mark
- Across the five Test groups, the mean marks of students who replied ranged from 69% to 81% (average 73%).
- The mean marks of students who did not reply ranged from 53% to 60% (average 57%).
- The mean actual % mark for repliers (M = 73, SD = 20, N = 56) was significantly higher than that for non-repliers (M = 57, SD = 22, N = 78) using the two-sample t-test for equal variances, t(132) = 4.48, p < .001.
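As a cross-check, the same pooled-variance formula applied to the mark data gives a t-statistic close to the reported t(132) = 4.48 (a sketch from the rounded summary statistics, not the authors' own calculation):

```python
import math

# Pooled two-sample t-test from the summary statistics (equal variances assumed).
n1, n2 = 56, 78  # repliers, non-repliers
pooled_var = ((n1 - 1) * 20 ** 2 + (n2 - 1) * 22 ** 2) / (n1 + n2 - 2)
t = (73 - 57) / math.sqrt(pooled_var * (1 / n1 + 1 / n2))
print(round(t, 2))  # close to the reported t(132) = 4.48
```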

13 Figure 3 % Marks of Natural Science students who replied to their feedback and who did not.

14 Conclusions
- A quarter of 161 FHEQ3 students did not read summative e-feedback. Between three-eighths and three-quarters did read it (from beginning to end).
- Both studies found lower-performing students are less likely to respond to a request for an email reply. Either they are less likely to read their feedback and see the request, or they read their feedback as much as their high-performing peers but are less likely to respond upon seeing such a request.

15 Conclusions
- Alone at a PC is not necessarily a fertile location for learning, and we should use strategies that condition students to routinely engage with feedback:
  - Returning formative feedback that is linked to a summative task with the same assessment criteria.
  - In-class support after students have received summative feedback (with the mark hidden?) online.
- Around 20 times more time was spent marking the summative Excel task than the associated formative exercise – should this ratio be reversed?

16 References
1. Orsmond, P., S. Merry, and K. Reiling. "The use of student derived marking criteria in peer and self-assessment." Assessment and Evaluation in Higher Education 25(1): 21–38.
2. Black, P., and D. Wiliam. "Assessment and Classroom Learning." Assessment in Education: Principles, Policy and Practice 5(1): 7–74.
3. Weaver, M. "Do students value feedback? Students' perception of tutors' written responses." Assessment and Evaluation in Higher Education 31: 379–394.
4. Higgins, R., P. Hartley, and A. Skelton. "Getting the message across: The problem of communicating assessment feedback." Teaching in Higher Education 6(2): 269–274.
5. Heinrich, E., J. Milne, and M. Moore. "An Investigation into e-Tool Use for Formative Assignment Assessment – Status and Recommendations." Educational Technology and Society 12(4): 176–192.
6. Nicol, D. J., and C. Milligan. "Rethinking technology-supported assessment in terms of the seven principles of good feedback practice." In Innovative Assessment in Higher Education, edited by C. Bryan and K. Clegg. London: Taylor and Francis.
7. Denton, P., and P. Rowe. "Using statement banks to return online feedback: limitations of the transmission approach in a credit-bearing assessment." Assessment and Evaluation in Higher Education.

