
Assessment, Feedback and Evaluation
Vinod Patel & John Morrissey

Learning outcomes
By the end of this session you will be able to:
Define assessment, feedback and evaluation
Discuss how these are related and how they differ
Discuss the application of each in clinical education
Begin to apply them in practice

Lesson Plan
Definitions
Assessment: theory & practice
Tea break
Feedback
Evaluation: theory & practice
Questions and close

Definitions?
Assessment? Feedback? Evaluation?

Assessment: definition
“The processes and instruments applied to measure the learner’s achievements, normally after they have worked through a learning programme of one sort or another”
Mohanna K et al (2004) Teaching Made Easy – a manual for health professionals

Feedback: definition
“Specific information about the comparison between a trainee’s observed performance and a standard, given with the intent to improve the trainee’s performance”
Van de Ridder JM et al (2008) Med Educ 42(2): 189

Evaluation: definition
“A systematic approach to the collection, analysis and interpretation of information about any aspect of the conceptualisation, design, implementation and utility of educational programmes”
Mohanna K et al (2004) Teaching Made Easy – a manual for health professionals

Part 1: Assessment

In this section:
Purposes of assessment
Miller’s pyramid
The utility function

Why assess?

Why assess? (1 of 2)
To inform students of strengths and weaknesses.
To ensure adequate progress has been made before students move to the next level.
To provide certification of a standard of performance.

Why assess? (2 of 2)
To indicate to students which parts of the curriculum are considered important.
To select for a course or career.
To motivate students in their studies.
To measure the effectiveness of teaching and to identify weaknesses in the curriculum.

Summative / Formative

Clinical Education: Assessment Methods
Written assessments
Observed clinical practice
Others: vivas, portfolios …

How a skill is acquired
Cognitive phase
Fixative phase: practice, feedback
Autonomous phase
Fitts P & Posner M (1967) Human Performance

Miller’s pyramid (base to apex): Knows, Knows how, Shows how, Does
Miller GE (1990) Acad Med (Suppl) 65: S63


Miller’s pyramid mapped to assessment methods (base to apex):
Knows: MCQ, written exams
Knows how: short answer / reasoning
Shows how: OSCE, OSLER
Does: clinical work observed (ACAT, CbD, CEX)
Miller GE (1990) Acad Med (Suppl) 65: S63

Question: How can we tell whether these tests are any good or not?
Answer: We do the maths.

Utility function
U = w_R·R × w_V·V × w_E·E × w_A·A × w_C·C
where U = utility, R = reliability, V = validity, E = educational impact, A = acceptability, C = cost, and each w is the weight given to that attribute.
Van der Vleuten CPM (1996) Advances in Health Sciences Education 1,
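
As a hedged illustration of the multiplicative form (the scores and weights below are invented for this example, not taken from van der Vleuten), a very low score on any one criterion drags overall utility down no matter how strong the others are:

```latex
% Hypothetical attribute scores on a 0-1 scale, all weights set to 1
U = w_R R \times w_V V \times w_E E \times w_A A \times w_C C
  = 0.8 \times 0.9 \times 0.6 \times 0.7 \times 0.5
  \approx 0.15
```

This is why the function is usually read as a compromise model: improving the weakest attribute raises utility more than polishing an already strong one.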

The Assessment Pentagram
Validity, Reliability, Feasibility, Acceptability, Educational Impact

Validity & reliability
Validity: the extent to which the competence that the test claims to measure is actually being measured.
Reliability: the extent to which a test yields reproducible results.
Schuwirth & van der Vleuten (2006) How to design a useful test: the principles of assessment

Validity: another definition
“The degree to which empirical evidence and theoretical rationales support the adequacy and appropriateness of inferences and actions based on test scores or other modes of assessment.”
Messick (1994) Educational Researcher 23: 13

Some causes of low validity
Vague or misleading instructions to candidates.
Inappropriate or overcomplicated wording.
Too few test items.
Insufficient time.
Inappropriate content.
Items too easy or too difficult.
McAleer (2005) Choosing Assessment Instruments

Some causes of low reliability
Inadequate sampling.
Lack of objectivity in scoring.
Environmental factors.
Processing errors.
Classification errors.
Generalisation errors.
Examiner bias.
McAleer (2005) Choosing Assessment Instruments

Types of validity
Face, Predictive, Concurrent, Content, Construct

“The examination fairly and accurately assessed my ability”

“The examination fairly and accurately assessed the candidates’ ability”

Problem: appearances can be deceptive.

Types of reliability
Test-retest, Equivalent forms, Split-half, Interrater and intrarater
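
Several of these can be estimated directly from test data. Below is a minimal sketch, assuming numpy and a simulated candidates-by-items score matrix (the function name, sample sizes and figures are all invented for illustration), of split-half reliability with the Spearman-Brown correction to full test length:

```python
# Split-half reliability sketch: correlate odd-item and even-item half-test
# totals, then apply the Spearman-Brown correction. Data are simulated.
import numpy as np

def split_half_reliability(item_scores: np.ndarray) -> float:
    """item_scores: (candidates x items) matrix of item scores."""
    odd = item_scores[:, 0::2].sum(axis=1)    # totals on odd-numbered items
    even = item_scores[:, 1::2].sum(axis=1)   # totals on even-numbered items
    r_half = np.corrcoef(odd, even)[0, 1]     # correlation between the two halves
    return 2 * r_half / (1 + r_half)          # Spearman-Brown: step up to full length

# Simulate 100 candidates answering 40 dichotomous items
rng = np.random.default_rng(0)
ability = rng.normal(size=(100, 1))            # latent candidate ability
noise = rng.normal(scale=1.5, size=(100, 40))  # item-level noise
items = (ability + noise > 0).astype(float)    # 1 = correct, 0 = incorrect
print(f"estimated split-half reliability: {split_half_reliability(items):.2f}")
```

The same correlation machinery underpins test-retest and interrater estimates: replace the two half-test totals with scores from two sittings, or from two raters.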


The Assessment Pentagram
Validity, Reliability, Feasibility, Acceptability, Educational Impact

Utility function
U = w_R·R × w_V·V × w_E·E × w_A·A × w_C·C
where U = utility, R = reliability, V = validity, E = educational impact, A = acceptability, C = cost, and each w is the weight given to that attribute.
Van der Vleuten CPM (1996) Advances in Health Sciences Education 1,

Miller’s pyramid (base to apex): Knows, Knows how, Shows how, Does
Miller GE (1990) Acad Med (Suppl) 65: S63

Miller’s pyramid mapped to assessment methods (base to apex):
Knows: MCQ, written exams
Knows how: short answer / reasoning
Shows how: OSCE, OSLER
Does: clinical work observed (ACAT, CbD, CEX)
Miller GE (1990) Acad Med (Suppl) 65: S63

FY Workplace Assessment
Mini-CEX (from the USA): mini-Clinical Evaluation Exercise
DOPS (developed by the RCP): Direct Observation of Procedural Skills
CBD (based on GMC performance procedures): Case-based Discussion
MSF (from industry): Multi-Source Feedback
Carr (2006) Postgrad Med J 82:

Practical Exercise

Educational interventions
How will we assess?
How will we give feedback?
How will we evaluate?

Educational interventions
Communication skills for cancer specialists
2nd-year medical speciality training
Medical humanities SSM for medical students
Masters-level pharmacology module
Procedural skills for medical students
Clinical Officers: ETATMBA

Communication skills
Learning outcome: To improve communication skills of HCPs working with individuals with cancer, e.g. with respect to breaking bad news, discussion of management plans, end of life care
Duration: 2 days
Students: Mostly specialist cancer nurses, including Macmillan nurses, also consultants and trainee medics. N = 30.
Teaching & learning: Mainly consultations with simulated patients

Speciality training
Learning outcome: To ensure trainees have reached the appropriate stage in the acquisition of the knowledge, skills and attitudes necessary for independent medical practice
Duration: 1 year
Students: Second-year GP trainees. N = 50.
Teaching & learning: Clinical apprenticeship, protected training days

Medical humanities SSM
Learning outcome: Through a reading of Middlemarch by George Eliot, to enhance students’ ability to reflect on medical practice and to enter imaginatively into the lives of others
Duration: 90-minute sessions weekly for 10 weeks
Students: Second-year medical students. N = 20.
Teaching & learning: Small group teaching and discussion

M-level pharmacology module
Learning outcome: To enhance knowledge and understanding of pharmacotherapies used in diabetes and its complications, and to develop the ability to apply this knowledge in clinical practice
Duration: 200 hours, i.e. 20 CATS points
Students: Mostly DSNs, a few GPs and endocrinology trainees, some from overseas. N = 20.
Teaching & learning: 20 hours directed learning (small group teaching and discussion) and 180 hours self-directed learning

Procedural skills
Learning outcome: To ensure newly-qualified doctors are competent in all the bedside and near-patient procedures listed in Tomorrow’s Doctors
Duration: 4 years
Students: Medical students. N = 200.
Teaching & learning: Small group teaching sessions distributed across three hospitals

Clinical Officers
Learning outcome: To ……………
Duration: x………
Students: y……………………
Teaching & learning: z……

The ideal assessment instrument:
Totally valid.
Perfectly reliable.
Entirely feasible.
Wholly acceptable.
Huge educational impact.

Part 2: Feedback

Feedback: definition
“Specific information about the comparison between a trainee’s observed performance and a standard, given with the intent to improve the trainee’s performance”
Van de Ridder JM et al (2008) Med Educ 42(2): 189


In this section:
Importance of feedback
How to give feedback: models
How to improve feedback

Experiential learning

Feedback
Its value is self-evident: experiential learning cannot take place without it
It is often infrequent, untimely, unhelpful or incomplete
It is often not acted upon to improve performance

How to give feedback
Establish appropriate interpersonal climate
Use appropriate location
Establish mutually agreed goals
Elicit learner’s thoughts and feelings
Reflect on observed behaviours
Be non-judgmental
Relate feedback to specific behaviours
Offer right amount of feedback
Offer suggestions for improvement
Hewson MG & Little ML (1998) J Gen Int Med 13(2):

Practical Exercise

Some methods of feedback
Pendleton's rules
ALOBA
SCOPME model
Chicago model

Pendleton’s Rules
Clarification of matters of fact
Trainee identifies what went well
Trainer identifies what went well
Trainee discusses what they did not do well and how to improve
Trainer identifies areas for improvement
Agreement on areas for improvement and formulation of an action plan
Pendleton D et al (1984) in The Consultation: an Approach to Learning and Teaching

Difficulties with Pendleton?
The strict format may inhibit spontaneous discussion.
Unhelpful polarisation between “good points” and “bad points”.
Opening comments may seem predictable, insincere and merely a prelude to criticism.
Carr (2006) Postgrad Med J 82:

ALOBA (1 of 2)
Start with the learner’s agenda.
Look at the outcomes the learner and the patient are trying to achieve.
Encourage self-assessment and self-problem solving first.
Involve the whole group in problem-solving.
Use descriptive feedback to encourage a non-judgemental approach.
Provide balanced feedback.
Kurtz et al (1998) Teaching and Learning Communication Skills in Medicine

ALOBA (2 of 2)
Make offers and suggestions: generate alternatives.
Be well-intentioned, valuing and supportive.
Rehearse suggestions.
Value the interview as a gift of raw material for the group.
Opportunistically introduce theory, research evidence and wider discussion.
Structure and summarise learning to reach a constructive end-point.
Kurtz et al (1998) Teaching and Learning Communication Skills in Medicine

SCOPME model
Listen
Reflect back
Support
Counsel
Treat information in confidence
Inform without censuring

Chicago model
Review aims and objectives of the job at the start.
Give interim feedback of a positive nature.
Ask the learner to give a self-assessment of their progress.
Give feedback on behaviours rather than personality.
Give specific examples to illustrate your views.
Suggest specific strategies to the learner to improve performance.

Improving feedback
Recognise that we all need feedback to learn and improve
Ask for feedback yourself and model this process for learners
Inform learners that you expect them to ask for feedback
Make feedback a routine activity
Discuss the need for feedback with colleagues
Sergeant J & Mann K in Cantillon P & Wood D (eds) ABC of Learning and Teaching in Medicine 2nd edn (2010)

Part 3: Evaluation

Evaluation: definition
“A systematic approach to the collection, analysis and interpretation of information about any aspect of the conceptualisation, design, implementation and utility of educational programmes”
Mohanna K et al (2004) Teaching Made Easy – a manual for health professionals

In this section:
Purposes of evaluation
Data sources
Kirkpatrick’s hierarchy

The Audit Cycle
Ask question(s) → Review literature → Set criteria & standards → Design audit → Collect data → Analyse data → Feed back findings → Action plan → Re-audit → Review standards
Wakley G & Chambers R (2005) Clinical Audit in Primary Care

Why evaluate?
To ensure teaching is meeting students’ needs.
To identify areas where teaching can be improved.
To inform the allocation of faculty resources.
To provide feedback and encouragement to teachers.
To support applications for promotion by teachers.
To identify and articulate what is valued by medical schools.
To facilitate development of the curriculum.
Morrison (2003) Br Med J 326:

Evaluation
Scale: micro to macro
Formative to summative
Internal to external
Can you evaluate an assessment?

Evaluation: data sources
Student ratings, Employer ratings, Peer ratings, Video recordings, Self-ratings, Administrator ratings, Assessment scores, Teacher scholarship, Expert ratings, Teacher awards, Student interviews, Teaching portfolios, Exit ratings
Based on: Berk RA (2006) Thirteen Strategies to Measure College Teaching


Teaching Observation
A method of evaluating teaching
Different models and purposes
Three stages: pre-observation, observation, post-observation
Form (instrument) for recording the information, observation and feedback
Siddiqui ZS, Jonas-Dwyer D & Carr SE (2007) Twelve tips for peer observation of teaching. Medical Teacher 29:

Teaching Observation: purposes
Evaluation: authority / summative
Developmental: expert / formative
Peer review: collaborative / mutual learning


Evaluation: triangulation
Self, Peers, Students

Practical Exercise

Kirkpatrick’s Hierarchy (ascending): Reaction, Learning, Behaviour, Results
Moving up the levels, complexity of behaviour and time elapsed increase, reliable measurement becomes harder, and confounding factors multiply
Hutchinson (1999) Br Med J 318:

What’s the point?


Issues with Kirkpatrick
Is it a hierarchy?
Omissions: learning objectives?
Linkages between levels?
“Kirkpatrick plus”
Tamkin P et al (2002) Kirkpatrick and Beyond, IES

Conclusions
There is no perfect assessment instrument
Feedback to students is essential to experiential learning
The ultimate purpose of evaluation is to improve clinical outcomes