Computer-marked assessment or learning analytics?


Computer-marked assessment or learning analytics? Sally Jordan, Department of Physical Sciences, The Open University. VICE/PHEC, 29th August 2014

Background Definitions of learning analytics vary; e.g. Clow (2013, p. 683): “the analysis and representation of data about learners in order to improve learning”. But assessment is sometimes ignored when learning analytics are discussed. Ellis (2013) points out that assessment is ubiquitous in higher education, whilst student interactions in other online environments are not. I will argue, further, that analysing assessment behaviour enables us to monitor student behaviour in depth. The assessment literature is also relevant: e.g. Nicol & Macfarlane-Dick (2006) state that good feedback practice “provides information to teachers that can be used to shape teaching”.

Analysis of student errors We can look at errors at the level of the individual student or of the whole cohort. At the individual level, this can form the basis of diagnostic testing. At the cohort level, we can look for questions that students struggle with, then examine responses in more detail to learn more about the errors that students make. This can give insight into student misunderstandings; a sketch of what such an analysis might look like follows.
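As a minimal sketch of this kind of cohort-level analysis, the snippet below assumes a hypothetical response log (responses.csv) with columns student_id, question_id, response and is_correct; none of these names come from the talk, and the Open University's actual analysis pipeline is not described here.

```python
# Sketch only: cohort-level error analysis on a hypothetical response log.
import pandas as pd

# Assumed columns: student_id, question_id, response, is_correct (0/1).
responses = pd.read_csv("responses.csv")

# Facility index per question: the proportion of responses marked correct.
facility = responses.groupby("question_id")["is_correct"].mean()

# Questions the cohort struggles with: lowest facility first.
hardest = facility.sort_values().head(10)
print(hardest)

# Drill into the hardest question: frequent identical wrong responses
# suggest a shared misunderstanding rather than random error.
qid = hardest.index[0]
wrong = responses[(responses["question_id"] == qid) & (responses["is_correct"] == 0)]
print(wrong["response"].value_counts().head(5))
```

The value_counts step is where error analysis becomes diagnosis: the interesting signal is not that students get a question wrong, but that many of them get it wrong in the same way.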

So what topics in Maths for Science do students find difficult?

Analysis of student responses to individual questions Gives information about student errors, linked to their misconceptions. Confidence in the findings is increased when: the questions require a ‘free-text’ (constructed) response; the questions are in summative use (so students are trying); and similar errors are seen in different variants and in different questions. See Jordan (2014a).
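The point about variants can be made concrete with a small illustrative check: if the same wrong response appears at a similar rate in every variant of a question, it is more plausibly a shared misconception than an artefact of one variant's wording. The question id, variant column and file below are all hypothetical.

```python
# Sketch only: checking that an error recurs across question variants.
import pandas as pd

# Assumed columns: student_id, question_id, variant, response, is_correct.
responses = pd.read_csv("responses.csv")
wrong = responses[responses["is_correct"] == 0]

# For one (hypothetical) question, cross-tabulate wrong responses by variant,
# normalising within each variant. An error appearing at a similar rate in
# every column is a candidate misconception; one confined to a single
# variant may just reflect that variant's particular numbers.
one_question = wrong[wrong["question_id"] == "q_unit_conversion"]
table = pd.crosstab(one_question["response"], one_question["variant"], normalize="columns")
print(table.round(2))
```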

Analysis of engagement … “750 students used my quiz”

When do students do quizzes? (Overall engagement)

When do students do quizzes? (impact of hard deadlines)

When do students do quizzes? (typical patterns of use)
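The timing plots behind these three slide titles could be produced along the following lines: compute each submission's distance from the quiz deadline and tabulate the result. The attempts and deadlines tables, and all column names, are assumed for illustration only.

```python
# Sketch only: when do students submit, relative to the quiz deadline?
import numpy as np
import pandas as pd

# Assumed tables: attempts (student_id, quiz_id, submitted_at) and
# deadlines (quiz_id, deadline).
attempts = pd.read_csv("attempts.csv", parse_dates=["submitted_at"])
deadlines = pd.read_csv("deadlines.csv", parse_dates=["deadline"])

merged = attempts.merge(deadlines, on="quiz_id")
days_before = (merged["deadline"] - merged["submitted_at"]).dt.total_seconds() / 86400

# Count submissions by whole days before the deadline (negative = late).
# A spike at 0-1 days would suggest that hard deadlines, rather than the
# teaching calendar, are driving engagement.
print(np.floor(days_before).value_counts().sort_index())
```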

Student engagement with feedback

Student engagement with feedback (identical question, Module A vs. Module B)
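One simple way to quantify engagement with feedback on a question that allows repeated attempts: among students who got the question wrong at the first attempt, what fraction get it right at the second? Comparing this "recovery rate" for an identical question across two modules mirrors the Module A / Module B comparison above. This is my illustration, not the talk's method, and all names below are hypothetical.

```python
# Sketch only: a crude "recovery rate" proxy for engagement with feedback.
import pandas as pd

# Assumed columns: student_id, module, question_id, attempt_no, is_correct.
attempts = pd.read_csv("attempts.csv")

# The same question is assumed to be used, unchanged, in both modules.
q = attempts[attempts["question_id"] == "shared_question"]
first = q[q["attempt_no"] == 1].set_index(["module", "student_id"])["is_correct"]
second = q[q["attempt_no"] == 2].set_index(["module", "student_id"])["is_correct"]

# Of students who were wrong at attempt 1, what fraction were right at
# attempt 2? Students who never tried again count as not recovering.
wrong_first = first[first == 0]
recovered = second.reindex(wrong_first.index).fillna(0)
print(recovered.groupby(level="module").mean())
```

Treating a missing second attempt as a failure to recover is a deliberate choice here: not returning to a question after wrong-answer feedback is itself evidence of disengagement from that feedback.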

The future? Redecker, Punie and Ferrari (2012, p. 302) suggest that we should “transcend the testing paradigm”: the data collected from students’ interactions in an online environment make it possible to assess students on those actual interactions, rather than bolting assessment on separately.

References
Clow, D. (2013). An overview of learning analytics. Teaching in Higher Education, 18(6), 683-695.
Ellis, C. (2013). Broadening the scope and increasing the usefulness of learning analytics: The case for assessment analytics. British Journal of Educational Technology, 44(4), 662-664.
Nicol, D., & Macfarlane-Dick, D. (2006). Formative assessment and self-regulated learning: A model and seven principles of good feedback practice. Studies in Higher Education, 31(2), 199-218.
Redecker, C., Punie, Y., & Ferrari, A. (2012). eAssessment for 21st Century Learning and Skills. In A. Ravenscroft, S. Lindstaedt, C. D. Kloos & D. Hernandez-Leo (Eds.), 21st Century Learning for 21st Century Skills (pp. 292-305). Berlin: Springer.

For more about what I’ve discussed
Jordan, S. (2014a). Adult science learners’ mathematical mistakes: An analysis of student responses to computer-marked questions. European Journal of Science and Mathematics Education, 2(2), 63-87.
Jordan, S. (2014b). Using e-assessment to learn about students and learning. International Journal of eAssessment, 4(1).
Jordan, S. E., Bolton, J. P. R., Cook, L. J., Datta, S. B., Golding, J. P., Haresnape, J. M., Jordan, R. S., Murphy, K. P. S. J., New, K. J., & Williams, R. T. (in press). Thresholded assessment: Does it work? Report on an eSTEeM project. Will be available soon from http://www.open.ac.uk/about/teaching-and-learning/esteem/projects/themes/innovative-assessment/thresholded-assessment-does-it-work

Sally Jordan Senior Lecturer and Staff Tutor Head of Physics Department of Physical Sciences The Open University sally.jordan@open.ac.uk blog: http://www.open.ac.uk/blogs/SallyJordan/