
1 Computer-marked assessment as learning analytics Sally Jordan Department of Physical Science CALRG-C, 11th June 2014

2 What this presentation is about
- The potential of “assessment analytics”;
- The use of computer-based assessment as a diagnostic tool (at the individual student level);
- The analysis of responses to computer-based assessment at the cohort level to provide information about student misunderstandings and student engagement;
- A consideration of factors that affect student engagement;
- The future: student engagement as assessment?
[using examples from my work]

3 Relevant literature
- Definitions of learning analytics, e.g. Clow (2013, p. 683): “The analysis and representation of data about learners in order to improve learning”;
- But assessment is sometimes ignored when learning analytics are discussed. Ellis (2013) points out that assessment is ubiquitous in higher education whilst student interactions in other online environments are not;
- I will also argue that analysing assessment behaviour enables us to monitor behaviour in depth;
- Assessment literature is also relevant, e.g. Nicol & Macfarlane-Dick (2006) state that good feedback practice “provides information to teachers that can be used to shape teaching”.

4 Analysis at the individual student level: Diagnostic testing

5 Analysis at the cohort level: Student errors
- At the most basic level, look for questions that students struggle with (see the sketch below);
- Look at responses in more detail to learn more about the errors that students make;
- This can give insight into student misunderstandings.
So what topics in Maths for Science do students find difficult?
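As an illustration of that first step only, here is a minimal sketch. It assumes a hypothetical export of iCMA responses with one row per attempt and columns question_id and correct; the file name and column names are illustrative, not the actual OU data format.

```python
# Minimal sketch (assumed data layout, not the actual iCMA export):
# one row per attempt, with columns "question_id" and "correct" (True/False).
import pandas as pd

responses = pd.read_csv("icma_responses.csv")  # hypothetical file name

# Proportion of incorrect attempts per question: high values flag the
# questions, and hence the topics, that students struggle with most.
difficulty = (1 - responses.groupby("question_id")["correct"].mean()).sort_values(ascending=False)
print(difficulty.head(10))
```

Ranking questions by failure rate is only a starting point; the slides that follow look at the responses themselves.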

6 [Figure: topics in Maths for Science that students find difficult]

7 Analysis of student responses to individual questions
Gives information about student errors, linked to their misconceptions. Confidence in the findings is increased when:
- the questions require a ‘free-text’ (constructed) response;
- the questions are in summative use (students are trying);
- similar errors are seen in different variants.
See Jordan (2014). A sketch of this kind of tally is given below.
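To make the “similar errors in different variants” point concrete, here is one possible way to tally recurring wrong answers. The column names (variant_id, response_text, correct) are assumptions for illustration, not the published analysis pipeline.

```python
# Sketch only: tally the most frequent incorrect free-text responses for each
# variant of a question, so recurring errors can be compared across variants.
# Column names are assumptions, not the actual data schema.
import pandas as pd

responses = pd.read_csv("icma_responses.csv")  # hypothetical file name
wrong = responses[~responses["correct"]]       # keep incorrect attempts only

common_errors = (
    wrong.groupby("variant_id")["response_text"]
    .value_counts()        # counts, sorted within each variant
    .groupby(level=0)
    .head(5)               # five most common wrong answers per variant
)
print(common_errors)
```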

8 Why is the answer 243? (instead of 9)

9 The question was: Evaluate 3^(6/3). Students were evaluating (3^6)/3 = 729/3 = 243 instead of 3^(6/3) = 3^2 = 9.

10 For another variant the answer was 5000 instead of 100. The question was: Evaluate 10^(4/2). Students were evaluating (10^4)/2 = 10000/2 = 5000 instead of 10^(4/2) = 10^2 = 100.
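Written out, the intended reading and the reading these students appear to have applied are:

```latex
% Intended reading vs the reading behind the common wrong answers
\[ 3^{6/3} = 3^{2} = 9 \qquad \text{whereas} \qquad \frac{3^{6}}{3} = \frac{729}{3} = 243 \]
\[ 10^{4/2} = 10^{2} = 100 \qquad \text{whereas} \qquad \frac{10^{4}}{2} = \frac{10000}{2} = 5000 \]
```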

11 Measuring student engagement… “750 students used my iCMA”

12 Measuring student engagement…


14 When do students do iCMAs? (overall activity)

15 When do students do iCMAs? (impact of deadlines)

16 When do students do iCMAs? (typical patterns of use)
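Slides 14-16 are figure slides. As a hedged sketch of the kind of analysis behind them, the snippet below bins attempt timestamps by days before a cut-off date; the file name, column name and deadline are assumptions for illustration only.

```python
# Sketch (assumed data, not the actual OU logs): when do students submit,
# relative to an iCMA cut-off date?
import pandas as pd

attempts = pd.read_csv("icma_attempts.csv", parse_dates=["submitted_at"])
deadline = pd.Timestamp("2014-05-01")  # illustrative cut-off date

# Whole days between each attempt and the deadline (0 = deadline day).
days_before = (deadline - attempts["submitted_at"]).dt.days

# Attempts per day: a sharp spike near index 0 shows deadline-driven activity.
activity = days_before.value_counts().sort_index()
print(activity)
```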

17 Length of responses to short-answer questions

18 Student engagement with feedback

19 Student engagement with feedback (identical question, Module A vs Module B)

20 General conclusions
- Analysis of student responses to interactive computer-marked questions can give information about student misunderstandings and student engagement with assessment;
- Generally, students do what they believe their teachers expect them to do;
- Engagement with computer-marked assessment can act as a proxy for more general engagement with a module, and so act as an early warning if engagement is not as deep as we might wish (a minimal sketch of this idea follows).
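As a loose sketch of the “early warning” idea in the last bullet, under assumed data (one row per student per iCMA, with an attempted flag; the file name, column names and threshold are illustrative, not an implemented OU system):

```python
# Sketch only: flag students whose engagement with computer-marked assessment
# is low, as a possible proxy for weak engagement with the module overall.
# Data layout and the 0.5 threshold are assumptions for illustration.
import pandas as pd

engagement = pd.read_csv("icma_engagement.csv")  # hypothetical file name

# Proportion of the module's iCMAs each student has attempted so far.
attempt_rate = engagement.groupby("student_id")["attempted"].mean()

# Students below the threshold might warrant an early, supportive follow-up.
at_risk = attempt_rate[attempt_rate < 0.5].index.tolist()
print(f"{len(at_risk)} students flagged for follow-up")
```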

21 The future? Redecker, Punie and Ferrari (2012, p. 302) suggest that we should “transcend the testing paradigm”; data collected from student interaction in an online environment offer the possibility of assessing students on their actual interactions rather than adding assessment separately.

22 References
Clow, D. (2013). An overview of learning analytics. Teaching in Higher Education, 18(6), 683-695.
Ellis, C. (2013). Broadening the scope and increasing the usefulness of learning analytics: The case for assessment analytics. British Journal of Educational Technology, 44(4), 662-664.
Nicol, D. & Macfarlane-Dick, D. (2006). Formative assessment and self-regulated learning: a model and seven principles of good feedback practice. Studies in Higher Education, 31(2), 199-218.
Redecker, C., Punie, Y., & Ferrari, A. (2012). eAssessment for 21st Century Learning and Skills. In A. Ravenscroft, S. Lindstaedt, C. D. Kloos & D. Hernandez-Leo (Eds.), 21st Century Learning for 21st Century Skills (pp. 292-305). Berlin: Springer.

23 For more about what I’ve discussed
Jordan, S. (2011). Using interactive computer-based assessment to support beginning distance learners of science. Open Learning, 26(2), 147-164.
Jordan, S. (2012). Student engagement with assessment and feedback: Some lessons from short-answer free-text e-assessment questions. Computers & Education, 58(2), 818-834.
Jordan, S. (2013). Using e-assessment to learn about learning. In Proceedings of the 2013 International Computer Assisted Assessment (CAA) Conference, Southampton, 9th-10th July 2013. Retrieved from http://caaconference.co.uk/proceedings/
Jordan, S. (2014). Adult science learners’ mathematical mistakes: an analysis of student responses to computer-marked questions. European Journal of Science and Mathematics Education, 2(2), 63-87.

24 Sally Jordan Senior Lecturer and Staff Tutor Deputy Associate Dean, Assessment Faculty of Science The Open University sally.jordan@open.ac.uk blog: http://www.open.ac.uk/blogs/SallyJordan/

