Computer-marked assessment as learning analytics
Sally Jordan, Department of Physical Sciences
CALRG-C, 11th June 2014

What this presentation is about
- The potential of "assessment analytics";
- The use of computer-based assessment as a diagnostic tool (at the individual student level);
- The analysis of responses to computer-based assessment at the cohort level, to provide information about student misunderstandings and student engagement;
- A consideration of factors that affect student engagement;
- The future: student engagement as assessment?
[using examples from my work]

Relevant literature
- Definitions of learning analytics, e.g. Clow (2013, p. 683): "The analysis and representation of data about learners in order to improve learning";
- But assessment is sometimes ignored when learning analytics are discussed. Ellis (2013) points out that assessment is ubiquitous in higher education, whilst student interactions in other online environments are not;
- I will also argue that analysing assessment behaviour enables us to monitor behaviour in depth;
- The assessment literature is also relevant, e.g. Nicol & Macfarlane-Dick (2006) state that good feedback practice "Provides information to teachers that can be used to shape teaching".

Analysis at the individual student level: Diagnostic testing

Analysis at the cohort level: Student errors
- At the most basic level, look for questions that students struggle with (a first pass at this is sketched below);
- Look at responses in more detail to learn more about the errors that students make;
- This can give insight into student misunderstandings.
So which topics in Maths for Science do students find difficult?
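
This first pass can be automated. The sketch below is a minimal, hypothetical example: it assumes iCMA responses have been exported as a CSV with student_id, question_id and is_correct columns (illustrative names, not the actual OU logging format), and ranks questions by facility (the proportion of correct responses).

```python
# Minimal sketch: rank iCMA questions by facility (proportion correct).
# The CSV columns (student_id, question_id, is_correct) are assumed names,
# not the actual logging schema.
import csv
from collections import defaultdict

def question_facility(path):
    """Return {question_id: proportion of responses marked correct}."""
    correct = defaultdict(int)
    total = defaultdict(int)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            total[row["question_id"]] += 1
            correct[row["question_id"]] += int(row["is_correct"])
    return {q: correct[q] / total[q] for q in total}

facility = question_facility("icma_responses.csv")
# The lowest-facility questions are the ones to inspect more closely.
for qid, p in sorted(facility.items(), key=lambda kv: kv[1])[:10]:
    print(f"{qid}: {p:.0%} correct")
```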

Analysis of student responses to individual questions
This gives information about student errors, linked to their misconceptions. Confidence in the findings is increased when:
- the questions require a 'free-text' (constructed) response;
- the questions are in summative use (so students are trying);
- similar errors are seen in different variants.
See Jordan (2014).
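
Taken a step further, the same response log can be mined for the most common wrong answers in each variant of a question; an error that recurs across variants is good evidence of a genuine misconception rather than a one-off slip. A hypothetical sketch, again with assumed column names:

```python
# Sketch: tally the most common wrong free-text responses per question
# variant. Column names (question_id, variant, response, is_correct) are
# assumptions for illustration.
import csv
from collections import Counter, defaultdict

def common_wrong_answers(path, top_n=3):
    wrong = defaultdict(Counter)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if row["is_correct"] == "0":
                wrong[(row["question_id"], row["variant"])][row["response"].strip()] += 1
    return {key: c.most_common(top_n) for key, c in wrong.items()}

for (qid, variant), answers in sorted(common_wrong_answers("icma_responses.csv").items()):
    print(qid, variant, answers)
```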

Why is the answer 243? (instead of 9)

The question was: Evaluate 3^(6/3). Students were evaluating (3^6)/3 = 729/3 = 243, instead of 3^(6/3) = 3^2 = 9.

For another variant, the answer was 5000 instead of 100. The question was: Evaluate 10^(4/2). Students were evaluating (10^4)/2 = 10000/2 = 5000, instead of 10^(4/2) = 10^2 = 100.
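
Incidentally, this error is exactly the reading that most programming languages give an unbracketed expression, since exponentiation binds more tightly than division. A quick Python check reproduces both the intended and the mistaken evaluations:

```python
# ** binds more tightly than /, so 3**6/3 means (3**6)/3 -- the students' reading.
print(3 ** (6 / 3))   # 9.0    (the intended 3^(6/3))
print(3 ** 6 / 3)     # 243.0  (the (3^6)/3 that students computed)
print(10 ** (4 / 2))  # 100.0
print(10 ** 4 / 2)    # 5000.0
```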

Measuring student engagement… “750 students used my iCMA”

Measuring student engagement…

When do students do iCMAs? (overall activity)

When do students do iCMAs? (impact of deadlines)

When do students do iCMAs? (typical patterns of use)
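
The timing analyses behind these slides can be reproduced, in outline, from the same response log. A minimal sketch, assuming each row carries an ISO-format timestamp and using an illustrative cut-off date:

```python
# Sketch: bin iCMA responses by days before the module deadline.
# The 'submitted_at' column and the deadline itself are assumed values.
import csv
from collections import Counter
from datetime import datetime

DEADLINE = datetime(2014, 5, 1)  # illustrative cut-off date

bins = Counter()
with open("icma_responses.csv", newline="") as f:
    for row in csv.DictReader(f):
        submitted = datetime.fromisoformat(row["submitted_at"])
        bins[(DEADLINE - submitted).days] += 1

for days, n in sorted(bins.items(), reverse=True):
    print(f"{days:>3} days before deadline: {n} responses")
```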

Length of responses to short-answer questions

Student engagement with feedback

Student engagement with feedback (identical question): Module A vs. Module B
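
One rough, cohort-level proxy for engagement with feedback (not necessarily the measure behind these charts) is whether students change their answer between a first and second attempt at the same question. A hedged sketch, assuming the log rows appear in chronological order:

```python
# Sketch: what fraction of second attempts differ from the first?
# Rows are assumed to be in chronological order; column names are illustrative.
import csv
from collections import defaultdict

attempts = defaultdict(list)  # (student_id, question_id) -> responses in order
with open("icma_responses.csv", newline="") as f:
    for row in csv.DictReader(f):
        attempts[(row["student_id"], row["question_id"])].append(row["response"])

pairs = [resp for resp in attempts.values() if len(resp) >= 2]
changed = sum(resp[0] != resp[1] for resp in pairs)
if pairs:
    print(f"{changed / len(pairs):.0%} of repeated attempts changed after feedback")
```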

General conclusions
- Analysis of student responses to interactive computer-marked questions can give information about student misunderstandings and student engagement with assessment;
- Generally, students do what they believe their teachers expect them to do;
- Engagement with computer-marked assessment can act as a proxy for more general engagement with a module (and so act as an early warning if engagement is not as deep as we might wish).
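
The early-warning point lends itself to a concrete rule. A minimal sketch, flagging any enrolled student with no iCMA activity in the run-up to a deadline; the window, deadline and names are all illustrative assumptions:

```python
# Sketch: flag students with no iCMA activity in the N days before a
# deadline, as a crude early-warning signal. Threshold, deadline and
# column names are illustrative assumptions.
import csv
from datetime import datetime, timedelta

DEADLINE = datetime(2014, 5, 1)
WINDOW = timedelta(days=7)

def flag_disengaged(path, enrolled_ids):
    active = set()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if datetime.fromisoformat(row["submitted_at"]) >= DEADLINE - WINDOW:
                active.add(row["student_id"])
    return sorted(set(enrolled_ids) - active)

# e.g. flag_disengaged("icma_responses.csv", enrolled_ids=["s001", "s002"])
```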

The future? Redecker, Punie and Ferrari (2012, p. 302) suggest that we should "transcend the testing paradigm": the data collected from students' interactions in an online environment offer the possibility of assessing students on those interactions themselves, rather than bolting assessment on separately.

References
Clow, D. (2013). An overview of learning analytics. Teaching in Higher Education, 18(6).
Ellis, C. (2013). Broadening the scope and increasing the usefulness of learning analytics: The case for assessment analytics. British Journal of Educational Technology, 44(4).
Nicol, D. & Macfarlane-Dick, D. (2006). Formative assessment and self-regulated learning: A model and seven principles of good feedback practice. Studies in Higher Education, 31(2).
Redecker, C., Punie, Y., & Ferrari, A. (2012). eAssessment for 21st century learning and skills. In A. Ravenscroft, S. Lindstaedt, C. D. Kloos & D. Hernandez-Leo (Eds.), 21st Century Learning for 21st Century Skills. Berlin: Springer.

For more about what I've discussed
Jordan, S. (2011). Using interactive computer-based assessment to support beginning distance learners of science. Open Learning, 26(2).
Jordan, S. (2012). Student engagement with assessment and feedback: Some lessons from short-answer free-text e-assessment questions. Computers & Education, 58(2).
Jordan, S. (2013). Using e-assessment to learn about learning. In Proceedings of the 2013 International Computer Assisted Assessment (CAA) Conference, Southampton, 9th-10th July 2013.
Jordan, S. (2014). Adult science learners' mathematical mistakes: An analysis of student responses to computer-marked questions. European Journal of Science and Mathematics Education, 2(2).

Sally Jordan
Senior Lecturer and Staff Tutor
Deputy Associate Dean, Assessment
Faculty of Science, The Open University
blog: