From student to lecturer: 20 years of research on assessment as a timeline Lin Norton & Bill Norton Liverpool Hope University 1.



Acknowledgments Research studies funded by: – Liverpool Hope – Assessment Plus (FDTL4 consortium project) – Write Now (HEFCE funded CETL) 2

The power of assessment… ‘Improving student learning implies improving the assessment system. Teachers often assume that it is their teaching that directs student learning. In practice, assessment directs student learning, because it is the assessment system that defines what is worth learning.’ (Havnes, 2004, p. 1) ‘Authentic assessment can be defined as assessment that is pedagogically appropriate – it frames students’ views of HE, it has a major influence on their learning and it directs their attention to what is important’ (Boud & Falchikov, 2007) 3

...and the problems Assessment and feedback are seen across the UK sector as problematic: – View that assessment in HE manifests many poor practices (Boud, 1995; Rust, 2007) – Need for more consistency in assessment practice, effective and open use of assessment criteria, and better use of feedback to promote learning (QAA, 2006; 2008) – Students are least satisfied with assessment and feedback (NSS, ) – ‘Students can, with difficulty, escape from the effects of poor teaching; they cannot, by definition if they want to graduate, escape the effects of poor assessment’ (Boud, 1995). 4

5 How the typical UK university student has changed
1963 The way we were | 2006 How things change
Education for the elite | Education for the ‘ordinary’
5% of 18 year olds | >40% of 18 year olds (still rising)
A wide choice of graduate jobs | 40% start in a non-graduate job
Fee paying or selective schools | Non-selective secondary schools
Male | Female (nearly 60%)
Predominantly white middle class | Wider participation
Likely to achieve a 2.2 | Likely to achieve a 2.1
Times Higher Education Supplement, 21 April 2006

6 Pressures on universities Highly competitive global market (Bologna process; Leitch, 2006) Widening participation (Dearing report, 1997) Emphasis on employability (HEFCE, 2010) Shrinking resources (Browne, 2010) League tables and student satisfaction measures – National Student Survey (NSS) Teaching Quality Information (TQI), which gives students an informed choice

Why a timeline? We will argue that assessment is a phenomenon continually influenced by external movements (e.g. AfL, NSS, QAA) which will go on developing. We will show how the course of research isn’t always driven by external factors. We will suggest that the perspective of the key stakeholder (tutor or student) is an important focus and show how our own research has taken us from student to tutor (and back again?) – Two worlds But, in constructing our presentation, doubts set in about the usefulness of a timeline…. 7

Assessment as a timeline? 1. Core criteria (mismatch) → 2. Fairness → 3. Lecturers’ views → 4. Assessment design → 5. Assessment, marking and feedback 8

Reflecting on our assessment research Looking back at our past studies with hindsight and with knowledge of our later research has enriched our interpretation but means we can group our results in many different ways…. This presentation will be largely thematic (and as chronological as possible) The timeline concept is to indicate some forward progression and building on our growing understandings but with the realisation that the context is always changing and there can be no end point Unlike the sat nav, in researching assessment ‘We have NOT reached our destination’ 9

How it all began…
Students (N=98) | Lecturers (N=6)
1 Answer question (both groups)
2 Content knowledge | 2 Understanding
3 Relevant information | 3 Argument
‘Mismatch’ between students and staff 10 Norton, 1990

Two worlds? The students’ perspective The lecturers’ view 11

Theme 1. How do we help students write better essays? Students don’t seem to realise what lecturers are looking for when marking their essays Is it because we aren’t making our assessment criteria clear enough? Notion of core assessment criteria: – ‘Core criteria are those that appear very commonly in assessment criteria across disciplines and institutions, and that appear to have a central role in the shared perception of what is important in a good student essay’ (Elander et al, 2004) They are:  Addressing the question  Structuring the answer  Demonstrating understanding  Developing argument  Using evidence  Evaluating sources  Use of language 12

13 Students’ understandings of the importance of assessment criteria ‘In the first year I didn’t really utilise the assessment criteria and therefore I didn’t know what the lecturers were looking for, it seems silly now not to look at the assessment criteria but I guess in the first year I just took it for granted that I would be told everything without having to actually do any independent thinking’ (‘David’, 3rd year)

14 Students’ views about how staff use assessment criteria “…I find that the lecturers and the way that we are taught and explained to us about assessment it’s sketchy and all over the place for the undergraduates, this is what I am finding at the moment…” (‘Lana’, M.Sc & undergraduate conversion diploma simultaneously, Institution A)

Research interventions Essay Feedback Checklist (Norton & Norton, 200; Norton et al, 2002) Workshops (Harrington et al, 2006; Norton et al, 2005) Book: Writing essays at university: A guide for students by students (Norton & Pitt et al, 2009) Research findings: Students wanted MORE guidance, MORE practice, MORE feedback 15

16 Do lecturers have shared understandings of core criteria? Q. What do you understand by ‘critical evaluation’? ‘at a ‘primitive’ level: ‘showing some emotion for what you’re doing’, ‘may even mean taking sides when you have very little evidence to support it’, display of personal and emotional involvement in what’s being studied, attempt to give other sides of argument’ (#7) ‘…if research carried out appropriately’ (#2) ‘Tough to define… Perhaps a misnomer; evaluation includes potential for criticism, which ‘entails thinking about theory in relation to both evidence and other theories’ (e.g., does theory stand up in light of empirical evidence, does another theory do a better job of explaining the evidence?)’ (#11)

17 Do lecturers attach the same weightings to the core criteria when marking? Q. Are these criteria equally important? ‘…using appropriate language, I don’t think is so important, I’m quite happy if they can criticize and evaluate. That said, it’s unusual to find someone who has the ability to critically evaluate but can’t write properly.’ [#1] ‘Addressing the question obviously. Use of evidence is really important in Psychology and that demonstrates their understanding. I’m less worried about structure in exam work because they’re under pressure. Developing an argument and critically evaluating are done very poorly by students.’ [#4] ‘I think they’re all as important.’ [#6]

Making assessment criteria transparent: What are the advantages for students? Students do not have the same understandings as their lecturers, but make active efforts to improve their essay writing (Higgins et al, 2002) Helps make criteria more explicit and understandable to students (Elander & Hardman, 2002; O’Donovan, Price & Rust, 2001; Price, Rust & O’Donovan, 2003). Makes the demands of the task clear Enables meaningful feedback & opportunity to improve (Nicol & Macfarlane-Dick, 2006) Meets Quality Assurance Agency (QAA) principles of equity, fairness and accountability 18

Making assessment criteria transparent: What are the advantages for lecturers? Helps to improve inter-rater reliability of marks (Newstead & Dennis, 1994). Mitigates order and practice effects, fatigue, and personal bias in marking. Sounder overall judgment than an intuitive ‘mental model’ (Elander, 2002). Helpful for novice assessors. Useful in defending judgments (double marking; external examiners). 19

And what are the disadvantages? Paradoxical effect of encouraging a ‘mechanistic’ (Marton & Saljo, 1997) rather than independent & meaningful approach to learning (Norton, 2004). The strategic student uses assessment criteria in an almost formulaic way to help her/him get the best mark possible – like Miller & Parlett’s (1974) ‘cue-seeker’ Lecturers often cannot agree on meanings and values attached to criteria Defining criteria in relation to marks results in vague statements like ‘excellent’, ‘good’, ‘adequate’, ‘poor’ Lecturers have mental models of marking which are resistant to new guidelines in applying assessment criteria (Wolf, 1995) Transparency itself is a contested notion (Orr, 2004) 20

21 Theme 2. Pressures and perceptions of ‘fairness’ Student as ‘customer’ – strategic and marks orientated and heavily dependent on lecturers Assessment can be perceived as unfair (Brunas-Wagstaff & Norton, 1998) Assessment often not authentic (Gulikers et al, 2004) Our research shows some of the effects of inauthentic assessment: ‘rules of the game’, plagiarism and cheating all ‘commonly’ used (Norton et al, 1996a,b; Norton et al, 2001)

Theme 3. Bringing the two worlds together If assessment is perceived by students as unfair or inauthentic, how are ‘new’ lecturers introduced to what is currently accepted as desirable assessment practice? Widely held view that assessment should be for rather than just of learning (Black, 2006) but… is this what lecturers, particularly ‘new’ lecturers on a university teaching programme, think and, if so, are they able to put their beliefs about assessment design into practice? and… what part do these university teaching programmes have to play? 22

23 Interview study with 10 lecturers on PGCert in L&T Q. Do you feel you have the freedom to change your assessment techniques easily? Six said No, four said Yes (but with reservations). Of those who said No: – ‘Not really because it is set in stone in the module proposal. You have to jump through many hoops if you are going to change the assessment techniques.’ (A) – ‘No! I get the impression that they are set in stone…. I think that hurdles of going to various panels to have your module changed puts people off…I get the impression from talking to colleagues that the process is long-winded and bureaucratic.’ (C)

24 Conclusions Our findings suggest that what new academics learn on PGCert courses about assessment may be constrained when they attempt to put their new found pedagogical knowledge into practice (Norton et al, 2010). This may be because of a complex interaction of institutional, departmental and individual factors (Becher & Trowler, 2001; Fanghanel, 2007). Led to our next studies looking at desirable assessment practice and possible constraints (questionnaire with 586 ‘new’ lecturers and interviews with 30 ‘experienced’ lecturers – Norton et al, 2011; Shannon et al, 2009).

25 Theme 4. Assessment design: the lecturers’ perspective Main research questions: 1. What do new lecturers (recently qualified or qualifying) think about current assessment design practices as a result of having undergone a university teaching programme? 2. Did they think they were able to put into practice ‘desirable’ assessment features, or did they feel there were constraints that would make this difficult?

The Assessment Design Inventory Desirable practice (N=586) I design my assessments to help students take responsibility for their own learning progress. 86% In my practice I emphasise assessment for learning rather than assessment of learning. 75% Involving students in the assessment design would encourage them to engage in the assessment task. 73% Constraints (N=586) Changes to my assessment design are sometimes hindered by external factors (e.g. cost, high student numbers, time). 75% There is little incentive for lecturers to innovate in their assessment practice. 61% It is possible for students to ‘go through the motions’ to satisfy assessment requirements without learning anything. 57% 26
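As an aside, the headline percentages above are simple agreement rates. A minimal sketch of how such rates can be computed from Likert-style inventory responses follows; the data and the top-two-category agreement rule are illustrative assumptions, not the actual ADI data (N=586) or scoring protocol:

```python
# Hypothetical Likert responses for one inventory statement
# (1 = strongly disagree ... 5 = strongly agree). NOT the real survey data.
responses = [5, 4, 2, 5, 3, 4, 4, 1, 5, 4]

# Treat 4 ("agree") and 5 ("strongly agree") as agreement.
agreeing = sum(1 for r in responses if r >= 4)
percent_agree = round(100 * agreeing / len(responses))

print(f"{percent_agree}% agree")  # 7 of 10 hypothetical responses -> "70% agree"
```

The same calculation, repeated per statement over the full sample, yields figures in the form reported on the slide.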

27 Assessment design: further findings Desirable assessment design practice was more likely to be reported by lecturers who had obtained their HE teaching qualification, had over 8 years teaching experience, were female, taught in soft applied disciplines and who worked in polytechnic or modern universities. Potential constraints to desirable assessment design practice were more likely to be reported by lecturers who had less than 8 years teaching experience, were male, taught in hard pure disciplines and who worked in traditional universities.

Experienced lecturers’ learning and teaching orientations From our interview study we have found two ‘orientations’ which we have tentatively identified as: 1.Professional: emphasis on skills and knowledge (N=6) ‘Learning is acquiring knowledge and hopefully an understanding of the context into which that knowledge can be put - so it isn’t purely an acquisition of knowledge process, you’ve got to be able to use that knowledge for it to be learning’ 2.Developmental: emphasis on skills, knowledge AND personal development (N=11): ‘An acquisition of knowledge, skills, techniques, an evolution, a maturing, a partnership, a process.’ Shannon et al,

Do lecturers’ orientations affect how they view assessment design? Lecturers with a professional orientation defined assessment in terms of criteria, learning objectives and learning outcomes: – Assessment characterised as both a standardised and standardising event – The focus is on ensuring parity, that all students awarded a given grade should meet certain objective standards Lecturers with a developmental orientation took a more student-centred approach by focussing on the student’s perspective of the assessment regime: – Assessment is seen as a range of tasks to suit a range of different learners 29

Do lecturers’ orientations affect how they view feedback? Lecturers with a professional orientation saw feedback primarily as a post assessment activity, a means of propelling the student to better grades in the next assignment and a process in which the students are recipients of, rather than active participants in, the feedback cycle: ‘When a student has read a bit of feedback they should be able to work out from that feedback what they need to do differently, what they need to do to improve…’ Lecturers with a developmental orientation conceptualised feedback less as a post assessment activity and more in terms of relationship building, a continual activity: ‘…there needs to be a chance for students to actually feedback on the feedback, to come back and say actually I didn’t agree with that or what did you mean by that but, again, that’s part of the whole process’. 30

Theme 5. Disciplinary views on assessment methods, marking and feedback Assessment, Marking & Feedback Inventory (AMFI) study with 45 lecturers (Hope): Some indications that hard applied lecturers were ‘more traditional’ in their approach to marking and feedback: they were less keen to spend time on it and used fewer assessment methods than lecturers from the other two disciplinary groupings. This lends some support to our earlier finding that hard applied lecturers were more likely to be constrained in their assessment design (Norton et al, 2011). 31

From discipline to revisiting orientation New revised and developed version of AMFI together with ADI piloted on 30 lecturers at Hope Two statements devised to identify their ‘orientations’: – Professional: Assessment is primarily about upholding standards to feed into the professional/vocational role that students may fill. Feedback is mainly post assessment and initiated to help students improve grades – Developmental: Assessment is mainly about encouraging students’ personal growth and development. Feedback is on-going and involves a two way relationship involving dialogue over time 32

Preliminary AMFI findings Some small but significant differences: lecturers with a professional orientation were more likely to: – see QAA requirements as a restriction Lecturers with a developmental orientation were more likely to agree that: – new assessment methods are needed – there is little incentive to innovate – students focussing on grades more than on learning is a failure of the educational system 33

More complex than a timeline? The student voice The lecturer perspective The research Is it time to go back to the students? 34