University of York May 2008 Using assessment to support student learning Graham Gibbs.

Personal Background
Open University
– Top in National Student Survey, especially assessment and feedback ratings
Oxford Brookes University
– Most ‘coursework’ assessment
– Systematic course design
University of Oxford
– Least ‘coursework’ assessment
– Top in National Student Survey, especially assessment and feedback ratings
– ‘No course design’

Personal Background
Practical books and articles about assessment
– ‘53 Interesting Ways to Assess Your Students’
– ‘Assessing Student Centred Courses’
– ‘Assessing Large Classes’
Consultancy to universities on strategic decisions about assessment policy
Research into the impact of assessment on student learning

Student experience of assessment

“I just don’t bother doing the homework now. I approach the courses so I can get an ‘A’ in the easiest manner, and it’s amazing how little work you have to do if you really don’t like the course.”

“I am positive there is an examination game. You don’t learn certain facts, for instance, you don’t take the whole course, you go and look at the examination papers and you say ‘looks as though there have been four questions on a certain theme this year, last year the professor said that the examination would be much the same as before’, so you excise a good bit of the course immediately…”

“The feedback on my assignments comes back so slowly that we are already on the topic after next and I’ve already submitted the next assignment. I just look at the mark and throw it in the bin.”

“ The tutor likes to see the right answer circled in red at the bottom of the problem sheet. He likes to think you’ve got it right first time. You don’t include any workings or corrections – you make it look perfect. The trouble is when you go back to it later you can’t work out how you did it and you make the same mistakes all over again”

“One course I tried to understand the material and failed the exam. When I took the resit I just concentrated on passing and got 98%. My tutor couldn’t understand how I failed the first time. I still don’t understand the subject so it defeated the object, in a way”

“I do not like the on-line assessment method…it was too easy to only study to answer the questions and still get a good mark … the wrong reasoning can still result in the right answer so the student can be misled into thinking she understands something … I think there should have been a tutor-marked assessment part way through the course so someone could comment on methods of working, layout etc.”

“We were told this course was going to be an opportunity to be creative, to take risks. Then in week five we were hit with a multiple choice question test and we realised what it was really all about.”

Summative assessment that is redundant
– Most students can, in their first year, predict their final results with some accuracy
– As few as 5% of assessments are necessary to produce the same overall grades
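The 5% claim can be illustrated with a toy simulation: if each student has a fairly stable underlying level of attainment, the average of a small sample of their marks tracks their overall grade closely. All the numbers below (200 students, 30 assessments, the ability and noise spreads) are illustrative assumptions, not data from the study.

```python
import random

random.seed(1)
n_students, n_assessments = 200, 30

# Each student has a stable 'ability'; each mark is ability plus assessment noise.
abilities = [random.gauss(60, 10) for _ in range(n_students)]
marks = [[a + random.gauss(0, 8) for _ in range(n_assessments)] for a in abilities]

overall = [sum(m) / n_assessments for m in marks]  # grade from all 30 assessments
sampled = [sum(m[:2]) / 2 for m in marks]          # grade from just 2 of the 30

def pearson(x, y):
    """Plain Pearson correlation, no external libraries."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

r = pearson(overall, sampled)
print(round(r, 2))  # high with these settings; the exact value depends on the seed
```

With these (assumed) settings the two-assessment grade correlates strongly with the full grade, which is the mechanism behind slides like this one: most of the marking adds little information about rank order.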

Assessment that improves learning
– The case of the Engineer
– The case of the Manager
– The case of the Pharmacist
– The case of the Psychologist
– The case of the Accountant

The case of the engineer
– Weekly lectures, problem sheets and classes
– Marking impossible
– Problem classes large enough to hide in
– Students didn’t tackle the problems
– Exam marks: 45%

The case of the engineer
– Course requirement to complete 50 problems
– Peer assessed in six ‘lecture’ slots
– Marks do not count
– Lectures, problems, classes, exams unchanged
– Exam marks increased from 45% to 85%
Why did it work?

The case of the engineer
– time on task
– social learning and peer pressure
– timely and influential feedback
– learning by assessing
  – error spotting
  – developing judgement (internalisation of standards)
  – self-supervision (meta-cognitive awareness)

“Conditions under which assessment supports student learning”

Quantity and distribution of student effort
1 Assessed tasks capture sufficient student time and effort
2 These tasks distribute student effort evenly across topics and weeks

Quality and level of student effort
3 These tasks engage students in productive learning activity
4 Assessment communicates clear and high expectations to students

Quantity and timing of feedback
5 Sufficient feedback is provided, both often enough and in enough detail
6 The feedback is provided quickly enough to be useful to students

Quality of feedback
7 Feedback focuses on learning rather than on marks or students themselves
8 Feedback is understandable to students, given their sophistication

Student response to feedback
9 Feedback is received by students and attended to, and is acted upon by students to improve their work or their learning

Effective assessment tactics
– Bioscience: poster reports
– Engineering: sampling lab reports + cheap feedback
– Law: essay requirements + sampling + models
– Estates management: project exams
– French Literature: critiquing texts under examination conditions

Assessment Experience Questionnaire
Measures the extent to which the ‘conditions’ are perceived to be met
– Quantity and distribution of effort
– Quality, quantity and timeliness of feedback
– Use of feedback
– Impact of exams on quality of learning
– Deep approach
– Surface approach
– Clarity of goals and standards
– Appropriateness of assessment
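Questionnaires of this kind are typically scored by averaging Likert-scale items into subscale scores, reverse-scoring negatively worded items first. The item groupings, item IDs and reverse-keyed items below are hypothetical placeholders, not the published AEQ scoring key; a minimal sketch:

```python
def score_subscales(responses, key, reverse_items=(), scale_max=5):
    """responses: {item_id: 1..scale_max}; key: {subscale: [item_ids]}.
    Reverse-keyed items are flipped so a high score always means 'more of'
    the construct the subscale measures."""
    adjusted = {
        item: (scale_max + 1 - r) if item in reverse_items else r
        for item, r in responses.items()
    }
    return {
        subscale: sum(adjusted[i] for i in items) / len(items)
        for subscale, items in key.items()
    }

# Hypothetical key and one student's responses (not real AEQ items).
key = {"quantity_of_feedback": ["q1", "q2"], "use_of_feedback": ["q3", "q4"]}
responses = {"q1": 4, "q2": 5, "q3": 2, "q4": 3}
scores = score_subscales(responses, key, reverse_items={"q3"})
# q3 is reverse-keyed, so a response of 2 becomes 4 before averaging.
```

Subscale means like these are what allow the later slides to relate programme characteristics to student learning responses.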

‘University A’ is the Open University
– 8 assignments per course
– Detailed written feedback on every assignment
– Quality assurance of feedback
– Less funding per student than any other university, mainly spent on feedback
– Best student ratings nationally
– Much pedagogic research
– Formative-only early assignments improve retention
– Computer-based assignments reduce retention and performance

…then I went to Oxford
Oxford responds in a limited way to most national quality assurance guidelines
– learning outcomes
– assessment criteria
– alignment of assessment with aims
Oxford has not ‘modernised’ its assessment
– reliance on examinations, little assessed coursework, little summative assessment of any kind, no modularisation
Outstanding quality of student experience at Oxford
– student retention of 98% (1st in UK)
– Oxford ranked 1st for teaching in UK (Times, Guardian)
– better CEQ scores than elsewhere in the world
– better NSS ratings than the Open University

Research questions
– What are the characteristics of programme level assessment environments that are associated with positive student learning responses?
– Are the characteristics of programme level assessment environments that are most closely associated with positive student learning responses those that quality assurance regulations emphasise?

Research design
– Three contrasting universities (Oxford, pre-1992, post-1992)
– Three contrasting programmes in each (Humanities, Science, Applied Social Science)
– Characterise assessment environments
  – Read documentation (all modules)
  – Interview programme leader, lecturers and students
– Administer AEQ
– Explore relationships between characteristics of programme level assessment design and qualities of student learning
– with Harriet Dunbar-Goddet, Chris Rust and Sue Law
– funded by the Higher Education Academy

Coding characteristics of programme level assessment environments
– % marks from examinations
– Volume of summative assessment
– Volume of formative-only assessment
– Volume of (formal) oral feedback
– Volume of written feedback
– Timeliness: days after submission before feedback provided
– Explicitness of criteria and standards
– Alignment of goals and assessment

Coding characteristics of programme level assessment environments
% marks from examinations
– High: more than 70%
– Med: between 40% and 70%
– Low: less than 40%
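The banding rule above can be stated directly in code. The function name is my own, and assigning the exact 40% and 70% boundary values to ‘Med’ is an assumption where the slide is silent; a minimal sketch:

```python
def band_exam_percentage(pct: float) -> str:
    """Band '% marks from examinations' into the study's High/Med/Low codes.
    Thresholds follow the slide: High > 70%, Med 40-70%, Low < 40%.
    Boundary values (exactly 40 or 70) are treated as Med here - an assumption."""
    if pct > 70:
        return "High"
    if pct >= 40:
        return "Med"
    return "Low"

# The observed range across programmes ran from 17% to 100% of marks from exams.
assert band_exam_percentage(100) == "High"
assert band_exam_percentage(55) == "Med"
assert band_exam_percentage(17) == "Low"
```

The same High/Med/Low coding style applies to the other quantitative characteristics (volume of feedback, timeliness, and so on), each with its own thresholds.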

Coding characteristics of programme level assessment environments
Explicitness of criteria and standards
– High: clear criteria for most assignments and exams; linked to grades; effort made to enable students to internalise criteria and standards
– Low: explicit criteria and standards rare and/or nebulously formulated; marks/grades arrived at through global judgement in a tacit way; no effort to enable students to internalise criteria and standards

Range of characteristics of programme level assessment environments
– % marks from exams: 17% - 100%
– number of times work marked:
– number of times formative-only assessment:
– number of hours of oral feedback:

Institutional assessment environments

Aspect of assessment                              Oxbridge   Post-1992   Pre-1992
Volume/frequency of formative-only assessment     High       Low         Med
% marks from exams                                High       Low         Med
% marks from coursework                           Low        High        Med
Alignment of learning activity with assessment    Low        High        Med
Explicitness of goals/outcomes and criteria       Low        High        Low

Patterns of assessment features within programmes
– every programme that is high on the volume of formative assessment is low on the volume of summative assessment
– no examples of high volume of summative assessment and high volume of feedback

Patterns of assessment features within programmes
– every programme that is low on the volume of summative assessment is high on the volume of formative assessment
– no examples of high volume of summative assessment and high volume of feedback
– there may be enough resources to mark student work many times, or to give feedback many times, but not enough resources to do both

Relationships between assessment characteristics and student learning

Assessment characteristics and student learning response: 1
When the level of explicitness of criteria and standards is high, students’ experience is characterised by:
– Less coverage of the syllabus
– Less and poorer quality feedback
– Less use of feedback
– Less learning from the examination
– Less deep approach

Assessment characteristics and student learning response: 2
When the level of alignment of goals and standards is high, students’ experience is characterised by:
– Less coverage of the syllabus
– Less and poorer quality feedback
– Less use of feedback
– Less appropriate assessment
– Less clear goals and standards
– Less learning from the examination
– Less deep approach

Assessment characteristics and student learning response: 3
When the variety of assessment methods is high, students’ experience is characterised by:
– Less and poorer quality feedback
– Less use of feedback
– Less appropriate assessment
– Less clear goals and standards
– Less learning from the examination
– Less deep approach
– More surface approach
– Less overall satisfaction

Assessment characteristics and student learning response: 4
When the volume of formative-only assessment is high, students’ experience is characterised by:
– More coverage of the syllabus
– More and better quality feedback
– More use of feedback
– More appropriate assessment
– More clear goals and standards
– More learning from the examination
– More deep approach
– More overall satisfaction

Assessment characteristics and student learning response: 5
When the volume of oral feedback is high, students’ experience is characterised by:
– More coverage of the syllabus
– More and better quality feedback
– More use of feedback
– More appropriate assessment
– More clear goals and standards
– More learning from the examination
– More deep approach
– More overall satisfaction

Assessment characteristics and student learning response: 6
When the timeliness of feedback is high, students’ experience is characterised by:
– More effort
– More coverage of the syllabus
– More and better quality feedback
– More use of feedback
– More appropriate assessment
– More clear goals and standards
– More learning from the examination

Summary
Explicitness of criteria and standards, alignment of goals and assessment, and variety of assessment are all associated with a negative learning experience
…they are also associated with more summative and less formative-only assessment, less oral feedback and less prompt feedback
Formative-only assessment, oral feedback and prompt feedback are all associated with a positive learning experience
…even when they are also associated with lack of explicitness of criteria and standards, lack of alignment of goals and assessment, and a narrow range of assessment.

Why?
– being explicit does not result in students being clear about what they are supposed to be doing or what counts as high quality
– ‘legitimate peripheral participation in a community of practice’ (Lave and Wenger; Price et al)

Why?
– Students experience very varied forms of assessment as confusing: ambiguity and anxiety are associated with a surface approach
– Feedback improves learning most when there are no marks
– Possible to turn feedback round quickly when there are no QA worries about marks

…alternative explanation A
The features of assessment environments identified here that appear to have negative consequences for student learning are also the features associated with modular courses, in which each separate module has to have ‘self-contained’ assessment within a short time frame.
Conclusion
It may be modularity, rather than QA regimes, that has caused some of the problems. Oxbridge is not modular (the Open University is… but has huge and long modules that are usually studied one at a time)

…alternative explanation B
– High volumes of assessed ‘coursework’ have been introduced in part to increase student engagement
– Student engagement improves learning outcomes
– A high % of marks from coursework is associated with higher marks and better degrees (at post-’92 universities)
However…
– The effect of engagement-enhancing innovations on learning outcomes holds only for low-ability students (high-ability students engage themselves) (Carini et al, in press)
– The present study did not control for student ability

Conclusions
– Assessment has more impact on how students go about studying, on the quantity and quality of their effort, and on their performance than teaching does
– It is relatively easy (and often cheap) to change student learning by changing assessment, provided the ‘conditions’ are met effectively
– Whole universities have implicit conventions about what is ‘acceptable’ in terms of assessment practice
– Some of these conventions are ill-informed and damaging… and are built in to QA systems
– Local contexts are likely to require different assessment strategies to engage their students.

References
Carini, R.M., Kuh, G.D. & Klein, S.P. (in press) Student engagement and student learning: testing the outcomes. Research in Higher Education.
Dunbar-Goddet, H. & Gibbs, G. (under review) A methodology for evaluating the effects of programme assessment environments on student learning. European Association for Research into Learning and Instruction, Assessment Conference, Northumbria.
Gibbs, G. (2002) Evaluation of the impact of formative assessment on student learning behaviour. European Association for Research into Learning and Instruction. Newcastle: Northumbria University, August.
Gibbs, G. & Simpson, C. (2003) Measuring the response of students to assessment: the Assessment Experience Questionnaire. 11th International Improving Student Learning Symposium, Hinckley.
Gibbs, G. & Simpson, C. (2004) Conditions under which assessment supports student learning. Learning and Teaching in Higher Education, 1, pp. 3-31.
Gibbs, G., Simpson, C. & Macdonald, R. (2003) Improving student learning through changing assessment: a conceptual and practical framework. European Association for Research into Learning and Instruction Conference, Padova, Italy.