Working with course teams to change assessment and student learning: the TESTA approach to change Graham Gibbs.

Programme
- The assessment phenomena TESTA helps to understand: Case study 1
- The TESTA conceptual framework
- The TESTA research tools
- The TESTA way of working with course teams
- Case study 2: improvements through TESTA

Bachelors programme
- Committed and innovative teachers
- Most marks from coursework, of very varied forms
- Very few exams
- Lots of written feedback on assignments: 15,000 words
- Learning outcomes and criteria clearly specified

…looks like a 'model' assessment environment, but it does not support learning well. Students:
- don't study many hours, and distribute their effort across few topics and weeks
- don't think there is a lot of feedback, or that it is very useful, and don't make use of it
- don't think it is at all clear what the goals and standards are
- …are unhappy!

TESTA conceptual framework
Assessment supports student learning when:
- It captures sufficient student time and effort, and distributes that effort evenly across topics and weeks (time on task)
- It engages students in high-quality learning effort, focused towards the goals and standards of the course, which students understand (engagement, deep approach)
- It provides enough high-quality feedback, in time to be useful, that focuses on learning rather than on marks or on the student
- Students pay attention to the feedback and use it to guide future studying (feedforward) and to learn to 'self-supervise'

Changing assessment at module level
Gibbs, G. (2009) Using Assessment to Support Student Learning. Online at Leeds Metropolitan University

TESTA research tools
- Assessment Experience Questionnaire
- Auditing assessment provision
- Student focus groups

…triangulated data, written up for the course team

1 Assessment Experience Questionnaire (AEQ)
- Quantity and distribution of effort
- Quality, quantity and timeliness of feedback
- Use of feedback
- Impact of exams on quality of learning
- Quality of study effort: deep and surface approach
- Clarity of goals and standards
- Appropriateness of assessment

2 Auditing assessment environments
- % marks from examinations (or coursework)
- Volume of summative assessment
- Volume of formative-only assessment
- Volume of (formal) oral feedback
- Volume of written feedback
- Timeliness: days after submission before feedback is provided

Range of characteristics of programme-level assessment environments
- % marks from exams: 0% – 100%
- number of times work is marked: 11 – 95
- variety of assessment:
- number of times formative-only assessment is used: 0 – 134
- number of hours of oral feedback: 1 – 68
- number of words of written feedback: 2,700 – 15,412
- turn-round time for feedback: 1 day – 28 days

Patterns of assessment features within programmes
- Every programme that has much summative assessment has little formative assessment, and vice versa; there are no examples of much summative assessment combined with much formative assessment.
- There may be enough resources to mark student work many times, or to give feedback many times, but not enough resources to do both.
…and it is clear which of the two leads to the most learning.

Relationships between assessment characteristics and student learning
A wide variety of assessment, much summative assessment, little formative-only assessment, little oral feedback and slow feedback are all associated with a negative learning experience:
- less of a deep approach
- less coverage of the syllabus
- less clarity about goals and standards
- less use of feedback

Relationships between assessment characteristics and student learning
Much formative-only assessment, much oral feedback and prompt feedback are all associated with a positive learning experience:
- more effort
- more of a deep approach
- more coverage of the syllabus
- greater clarity about goals
- more use of feedback
- more overall satisfaction…

…even when they are also associated with a lack of explicitness of criteria and standards, a lack of alignment of goals and assessment, and a narrow range of assessment.

TESTA way of working with course teams
- Only work with whole teams
- Meet the programme leader to negotiate practicalities, and interview them as part of the audit
- 'Before' TESTA data collection and report
- Discussion of the report with the whole team (2–3 hours); collegial decision-making about interpretation and action
- Follow-up communication with the programme leader about the rationale of changes and the timing of 'after' data collection
- 'After' data collection and report on changes
- Discussion of the impact of changes with the whole team, and consideration of longer/larger changes to assessment; collegial decision-making

Case study 1: what is going on?
- Lots of coursework, of very varied forms (lots of innovation)
- Very few exams
- Masses of written feedback on assignments
- Four-week turn-round of feedback
- Learning outcomes and criteria clearly specified

…looks like a 'model' assessment environment. But students:
- don't put in a lot of effort, and distribute their effort across few topics
- don't think there is a lot of feedback, or that it is very useful, and don't make use of it
- don't think it is at all clear what the goals and standards are

TESTA diagnosis
- All assignments are marked, and they are all that students spend any time on. It is not possible to mark enough assignments to keep students busy, and there are no exams or other unpredictable demands to spread effort across topics.
- There is almost no required formative assessment.
- Teachers are all assessing something interestingly different, but this utterly confuses students: far too much variety, and no consistency between teachers about what criteria mean or what standards are being applied.
- Students never get better at anything, because they don't get enough practice at each thing.
- Feedback is of no use to students, as the next assignment is completely different.
- Four weeks is much too slow for feedback to be useful, and results in students focusing on marks.

Case study 1: what to change?
- Reduce the variety of assignments and plan progression across three years for each type, with invariant criteria and many exemplars of different quality for each type of assignment
- Increase formative assessment: dry runs at what will later be marked; sampling for marking
- Reduce summative assessment: one per module is enough (24 in three years), or use longer/bigger modules
- Separate formative assessment and feedback from summative assessment and marks: give feedback quickly and marks later, or feedback on drafts and marks on the final submission
- Teachers to accept that the whole is currently less than the sum of the parts (and current feedback effort is largely wasted), and to give up some autonomy within modules for the sake of students' overall experience of the programme, so that teachers' effort is worthwhile

Case study 2: TESTA impact
- whole-programme shift from summative to formative assessment
- linking a narrower variety of types of assignment across modules, so feedback can feed in to future modules
- a new 'core team' to handle increased face-to-face feedback
- parallel changes by individual teachers, e.g. self-assessment
- reduction in exams; most MCQs abandoned
- 'fast track' approval of changes (discussion with PVCs)

Impact: AEQ scores moving in the right direction for:
- quantity and quality of feedback
- use of feedback
- appropriate assessment
- clear goals and standards
- surface approach (less of it!)
- overall satisfaction

TESTA: Transforming the Experience of Students Through Assessment
Tansy Jessop, Yassein El-Hakim
- Video lecture on rationale
- Research tools, how to use them, and how to interpret the data