Can a grade-based approach improve our assessment practice?

Presentation transcript:

Can a grade-based approach improve our assessment practice? Andrew Barnes, Susie Crawford, Sarah Churchill, Pete Olusoga, James Rumbold and Louise Turner

The context: UG Sport and Exercise Science
Modules of up to 200 students
Lectures/practicals/seminars
One- and two-task modules
Large marking teams (3-4 staff)

Assessment instructions (current practice)
Assessment brief (mark scheme)
Marking grid
Assessment screencast
In-class tasks (e.g. example lab reports at L4)

Current feedback practice
In-text annotations / audio file
Highlighted marking grid
Overall mark (%) based on weighted criteria
Text/audio advice on how to improve in subsequent work

The rationale for GBA
Consistency: marking teams of 3-4 staff assessing work over multiple criteria (SSC and NSS feedback).
Flawed assumptions: using an interval scale to represent student achievement that relies on a qualitative judgement has been suggested to be flawed (Dalziel, 1998).
Markers' judgement: a 100-point scale offers too many possible ways of categorising achievement; for example, can we adjudge a 65% as categorically different from a 64%? (Hornby, 2003).
Level-specific criteria: having different marking criteria between modules prevents students from understanding the level-specific criteria on which they are judged as they progress through their course (Sadler, 2005).
Compressed range of marks: feedback from externals suggests we need to make more use of the higher end of the scale (>75%).

What we did: GBA approach for selected L4 assessments
Not used for exams/phase tests
Similar instructions to students
Generic grading descriptor (no marking grid)
Grade score 0-18

Assessment feedback
In-text annotations / audio file
Highlighted grade descriptor
Grade score 0-18 (whole assessment)
How to move up a grade boundary
No criteria-specific grades

The dreaded conversion table!

Degree class | Grade | Grade score | Numerical equivalent (%) | Indicative mark range (%)
First | Perfect 1st | 18 | 100 | 100
First | Exceptional 1st | 17 | 96 | 99 - 93
First | High 1st | 16 | 89 | 92 - 85
First | Mid 1st | 15 | 81 | 84 - 78
First | Low 1st | 14 | 74 | 77 - 70
Upper second | High 2.1 | 13 | 68 | 69 - 67
Upper second | Mid 2.1 | 12 | 65 | 66 - 64
Upper second | Low 2.1 | 11 | 62 | 63 - 60
Lower second | High 2.2 | 10 | 58 | 59 - 57
Lower second | Mid 2.2 | 9 | 55 | 56 - 54
Lower second | Low 2.2 | 8 | 52 | 53 - 50
Third | High 3rd | 7 | 48 | 49 - 47
Third | Mid 3rd | 6 | 45 | 46 - 44
Third | Low 3rd | 5 | 42 | 43 - 40
Fail | Marginal fail | 4 | 38 | 39 - 35
Fail | Mid fail | 3 | 32 | 34 - 30
Fail | Low fail | 1-2 | - | 29 - 1
Fail | Zero | 0 | 0 | 0
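For anyone who would rather script the conversion than look it up by hand, here is a minimal sketch of the table as a Python lookup. The names GRADE_TABLE, grade_to_label and grade_to_percentage are illustrative only, and the value used for Low fail is an assumption, since the slide gives no single equivalent for scores 1-2.

```python
# Grade score (0-18) -> (grade label, numerical equivalent %) taken from the conversion table.
# Assumption: scores 1-2 (Low fail) have no single equivalent on the slide, so 29 (the top of
# the indicative range) is used here purely as a placeholder.
GRADE_TABLE = {
    18: ("Perfect 1st", 100), 17: ("Exceptional 1st", 96), 16: ("High 1st", 89),
    15: ("Mid 1st", 81), 14: ("Low 1st", 74),
    13: ("High 2.1", 68), 12: ("Mid 2.1", 65), 11: ("Low 2.1", 62),
    10: ("High 2.2", 58), 9: ("Mid 2.2", 55), 8: ("Low 2.2", 52),
    7: ("High 3rd", 48), 6: ("Mid 3rd", 45), 5: ("Low 3rd", 42),
    4: ("Marginal fail", 38), 3: ("Mid fail", 32),
    2: ("Low fail", 29), 1: ("Low fail", 29), 0: ("Zero", 0),
}

def grade_to_label(score: int) -> str:
    """Return the degree-class label for a 0-18 grade score."""
    return GRADE_TABLE[score][0]

def grade_to_percentage(score: int) -> int:
    """Return the numerical equivalent (%) for a 0-18 grade score."""
    return GRADE_TABLE[score][1]

print(grade_to_label(12), grade_to_percentage(12))  # Mid 2.1 65
```

Note that a 64% and a 65% under the percentage scheme both sit inside the Mid 2.1 band (66 - 64), which is exactly the point Hornby (2003) makes about the 100-point scale offering more distinctions than markers can defend.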

Evaluation: a mixed-methods approach from multiple perspectives
Student attainment
Level 4 students - online survey (assessment instructions/feedback)
Levels 5 and 6 - focus group
Staff - focus group

Student attainment

Laboratory Report | 15-16 | 16-17 | GBA
Mean | 50% | 48% | 52%
StDev | 17% | 20% | 10%
Min | 16% | 18% | 13%
Max | 80% | 81% | 96%

Presentation | 15-16 | 16-17 | GBA
Mean | 61% | 56% | 58%
StDev | 12% | 9% | 11%
Min | 27% | 20% | 22%
Max | 89% | 72% | 81%
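The attainment rows above are just descriptive statistics over each cohort's marks. A minimal sketch of how such a row could be produced with Python's statistics module; the marks list is an illustrative placeholder, not the real cohort data.

```python
from statistics import mean, stdev

def summarise(marks):
    """Return the Mean / StDev / Min / Max row used in the attainment comparison (marks in %)."""
    return {"Mean": round(mean(marks)), "StDev": round(stdev(marks)),
            "Min": min(marks), "Max": max(marks)}

# Illustrative marks only -- not the real cohort data.
print(summarise([52, 61, 45, 58, 49, 70, 38, 55]))
```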

L4 student feedback on assessment instructions, including suggested improvements (95 responses, 49% of the cohort)
"Be clear on the difference between grades for marking."
"Peer assessment seminar was really helpful in writing my work."
"More detail on what markers are looking for in scientific writing such as a lab report."
"All assessment information seemed detailed and very clear."
"Could talk about the university marking criteria more, and maybe be more clear on how our overall grade at the end of the year would be calculated."

L4 student feedback on assessment feedback, including suggested improvements (95 responses, 49% of the cohort)
"Feedback was really detailed and explains why I got the grade I did."
"A lot of the feedback only applied to that specific assessment so wasn't very helpful for improving in future assessments."
"Be clearer in weaknesses and strengths, try to provide information on how I can move up a grade boundary please."
"Thought the feedback was generally very useful although each point could have been in more detail."
"Rather than comments down the side of my work, a detailed breakdown of how to improve the area would be appreciated."

L5 and 6 student feedback
"What is the difference between good, very good and excellent?"
"Prefer marking grid as it highlights what is required in each section."
"I would be less likely to use the generic grade descriptor than the marking grid."
"Some lecturers are good at saying how we can improve, while others just repeat annotations in text."
"If there are 10-12 people marking, the criteria are still open to interpretation."
"I like knowing the % for my coursework so I know exactly how well I have to do in the exam."
"I like the idea of grades as I relate % back to it anyway."
"The more I moved through university the more I have used the marking grid. In first year I didn't know how to use a marking grid."
"GBA would make comparisons a lot easier. Some modules/assessments are perceived as being harder than others."

Staff feedback: what did module leaders and tutors make of the changes?
"Converting grades to percentages for SI and displaying grades on Blackboard was time-consuming and fiddly."
"I prefer the old-style marking grid as I felt it was easier to mark the work based on each criterion rather than overall."
"I found I was always comparing the grades I gave to the percentage equivalent."
"Made it easier to offer explicit feedforward in how to move up a grade boundary."
"Found it much quicker to mark assignments."
"Moderation was easier due to the reduced scale."
"I was more confident in the final grade awarded than when using a marking grid."
"At the end of the day, we are all experienced academics who should be able to make an informed judgement about the quality of a piece of work."

Staff (the verdict!)
Drawbacks: BB grading schemas (time-consuming); converting marks to % for SI; two tasks (aggregated score); the percentage mindset!
Benefits: more confident in the grade awarded; quicker marking process; moderation was easier; more explicit feedforward.
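The two administrative drawbacks (Blackboard grading schemas and converting marks to % for SI) are at least scriptable. A minimal sketch under assumptions: a hypothetical marks.csv with student_id and grade_score columns, the score-to-percentage mapping repeated from the conversion-table sketch above, and an output layout that would still need to match whatever SI actually expects.

```python
import csv

# Grade score (0-18) -> numerical equivalent (%), repeated from the conversion table.
# Assumption: 29 stands in for Low fail (scores 1-2), which has no single equivalent on the slide.
SCORE_TO_PCT = {18: 100, 17: 96, 16: 89, 15: 81, 14: 74, 13: 68, 12: 65, 11: 62,
                10: 58, 9: 55, 8: 52, 7: 48, 6: 45, 5: 42, 4: 38, 3: 32,
                2: 29, 1: 29, 0: 0}

def convert_for_si(in_path: str, out_path: str) -> None:
    """Read 0-18 grade scores from a CSV and write their percentage equivalents for upload."""
    with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=["student_id", "grade_score", "percentage"])
        writer.writeheader()
        for row in reader:
            score = int(row["grade_score"])
            writer.writerow({"student_id": row["student_id"],
                             "grade_score": score,
                             "percentage": SCORE_TO_PCT[score]})

# Hypothetical usage: convert_for_si("marks.csv", "marks_for_si.csv")
```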

Moving forward: we are rolling GBA into L5 for SES
Ensure assessment instruction clarity (screencast/in class).
Ensure every mark scheme clearly outlines expectations (marking grid info).
Develop/adapt assessment-specific descriptors (e.g. presentation/written piece).
Make feedforward points explicit (staff training).
GBA mindset! (readiness to change).

Acknowledgments
Dr James Rumbold
Dr Pete Olusoga
Dr Sarah Churchill
Dr Susie Crawford
Dr Louise Turner

References
Dalziel, J. (1998). Using marks to assess student performance: Some problems and alternatives. Assessment & Evaluation in Higher Education, 23, 351-366.
Hornby, W. (2003). Assessing using grade-related criteria: A single currency for universities? Assessment & Evaluation in Higher Education, 28, 435-454.
Sadler, D. (2005). Interpretations of criteria-based assessment and grading in higher education. Assessment & Evaluation in Higher Education, 30, 175-194.