Presentation on theme: "Assessment, Feedback and Evaluation" by Vinod Patel & John Morrissey. Presentation transcript:

1 Assessment, Feedback and Evaluation
Vinod Patel & John Morrissey

2 Learning outcomes
By the end of this session you will be able to:
- Define assessment, feedback and evaluation.
- Discuss how these are related and how they differ.
- Discuss the application of each in clinical education.
- Begin to apply them in practice.

3 Lesson Plan
- Definitions
- Assessment: theory & practice
- Tea break
- Feedback
- Evaluation: theory & practice
- Questions and close

4 Definitions
Assessment? Feedback? Evaluation?

5 Assessment: definition
"The processes and instruments applied to measure the learner's achievements, normally after they have worked through a learning programme of one sort or another."
Mohanna K et al (2004) Teaching Made Easy – a manual for health professionals

6 Feedback: definition
"Specific information about the comparison between a trainee's observed performance and a standard, given with the intent to improve the trainee's performance."
Van de Ridder JM et al (2008) Med Educ 42(2): 189

7 Evaluation: definition
"A systematic approach to the collection, analysis and interpretation of information about any aspect of the conceptualisation, design, implementation and utility of educational programmes."
Mohanna K et al (2004) Teaching Made Easy – a manual for health professionals

8 Part 1: Assessment

9 In this section:
- Purposes of assessment
- Miller's pyramid
- The utility function

10 Why assess?

11 Why assess? (1 of 2)
- To inform students of strengths and weaknesses.
- To ensure adequate progress has been made before students move to the next level.
- To provide certification of a standard of performance.

12 Why assess? (2 of 2)
- To indicate to students which parts of the curriculum are considered important.
- To select for a course or career.
- To motivate students in their studies.
- To measure the effectiveness of teaching and to identify weaknesses in the curriculum.

13 Summative / Formative

14 Clinical Education: Assessment Methods
- Written assessments
- Observed clinical practice
- Others: vivas, portfolios …

15 How a skill is acquired
- Cognitive phase
- Fixative phase (practice, feedback)
- Autonomous phase
Fitts P & Posner M (1967) Human Performance

16 Miller's pyramid (top to bottom): Does, Shows how, Knows how, Knows.
Miller GE (1990) Acad Med (Suppl) 65: S63


19 Miller's pyramid mapped to assessment instruments:
- Knows: MCQ
- Knows how: short answer / reasoning, written exams
- Shows how: OSCE, OSLER
- Does: clinical work observed (ACAT, CbD, mini-CEX)
Miller GE (1990) Acad Med (Suppl) 65: S63
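For anyone scripting a blueprinting exercise against the pyramid, a minimal sketch of the slide's level-to-instrument mapping as a lookup table; the structure and names are illustrative, not from Miller.

```python
# Miller's pyramid levels mapped to the assessment instruments named on the
# slide, lowest level first. Purely an illustrative data structure.
MILLER = {
    "Knows": ["MCQ"],
    "Knows how": ["short answer / reasoning", "written exams"],
    "Shows how": ["OSCE", "OSLER"],
    "Does": ["observed clinical work (ACAT, CbD, mini-CEX)"],
}

for level, instruments in MILLER.items():
    print(f"{level}: {', '.join(instruments)}")
```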

20 Question: How can we tell whether these tests are any good or not?
Answer: We do the maths.

21 Utility function
U = wR·R × wV·V × wE·E × wA·A × wC·C
where U = utility, R = reliability, V = validity, E = educational impact, A = acceptability, C = cost, and each w is the weight given to that component.
Van der Vleuten CPM (1996) Advances in Health Sciences Education 1: 41-67
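A minimal sketch of how the utility function could be computed, assuming each component has been scored on a common 0-1 scale; the component scores and equal weights below are illustrative assumptions, not values from van der Vleuten.

```python
# Sketch of van der Vleuten's utility function: utility is the product of
# weighted components, so a near-zero score on any one component drags down
# the utility of the assessment as a whole.

def utility(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Multiply weighted component scores; scores assumed on a 0-1 scale."""
    u = 1.0
    for component, score in scores.items():
        u *= weights[component] * score
    return u

# Hypothetical scores for an OSCE-style exam (invented for illustration).
scores = {"reliability": 0.8, "validity": 0.9, "educational_impact": 0.7,
          "acceptability": 0.9, "cost": 0.5}
weights = {k: 1.0 for k in scores}  # equal weighting as a default assumption

print(f"U = {utility(scores, weights):.3f}")
```

The multiplicative form is the point of the slide: unlike a simple average, a test that is perfectly reliable but unacceptable to candidates still ends up with low utility.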

22 The Assessment Pentagram: Validity, Reliability, Feasibility, Acceptability, Educational Impact.

23 Validity & reliability
Validity: the extent to which the competence that the test claims to measure is actually being measured.
Reliability: the extent to which a test yields reproducible results.
Schuwirth & van der Vleuten (2006) How to design a useful test: the principles of assessment

24 Validity: another definition
"The degree to which empirical evidence and theoretical rationales support the adequacy and appropriateness of inferences and actions based on test scores or other modes of assessment."
Messick (1994) Educational Researcher 23: 13

25 Some causes of low validity
- Vague or misleading instructions to candidates.
- Inappropriate or overcomplicated wording.
- Too few test items.
- Insufficient time.
- Inappropriate content.
- Items too easy or too difficult.
McAleer (2005) Choosing Assessment Instruments

26 Some causes of low reliability
- Inadequate sampling.
- Lack of objectivity in scoring.
- Environmental factors.
- Processing errors.
- Classification errors.
- Generalisation errors.
- Examiner bias.
McAleer (2005) Choosing Assessment Instruments

27 Types of validity: face, predictive, concurrent, content, construct.

28 "The examination fairly and accurately assessed my ability"

29 "The examination fairly and accurately assessed the candidates' ability"

30 Problem: appearances can be deceptive.

31 Types of reliability: test-retest, equivalent forms, split-half, inter-rater and intra-rater.
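As a concrete illustration of one of these, a minimal sketch of split-half reliability with the Spearman-Brown correction, assuming per-candidate item scores are available; the data and function names are invented for illustration, computed in plain Python rather than any particular psychometrics library.

```python
# Split-half reliability sketch: correlate candidates' scores on the two
# halves of a test (here odd- vs even-positioned items), then apply the
# Spearman-Brown correction to estimate full-length reliability.
import math

def pearson(x, y):
    """Pearson correlation of two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def split_half_reliability(item_scores):
    """item_scores: one list of item scores per candidate."""
    half1 = [sum(items[0::2]) for items in item_scores]  # items 1, 3, 5, ...
    half2 = [sum(items[1::2]) for items in item_scores]  # items 2, 4, 6, ...
    r = pearson(half1, half2)
    return 2 * r / (1 + r)  # Spearman-Brown step-up to full test length

# Hypothetical 6-item scores for five candidates (illustrative only).
data = [[1, 0, 1, 1, 0, 1],
        [1, 1, 1, 1, 1, 0],
        [0, 0, 1, 0, 0, 0],
        [1, 1, 1, 1, 1, 1],
        [0, 1, 0, 0, 1, 0]]
print(f"split-half reliability ~ {split_half_reliability(data):.2f}")
```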


34 The Assessment Pentagram: Validity, Reliability, Feasibility, Acceptability, Educational Impact.

35 Utility function
U = wR·R × wV·V × wE·E × wA·A × wC·C
where U = utility, R = reliability, V = validity, E = educational impact, A = acceptability, C = cost, and each w is the weight given to that component.
Van der Vleuten CPM (1996) Advances in Health Sciences Education 1: 41-67

36 Miller's pyramid (top to bottom): Does, Shows how, Knows how, Knows.
Miller GE (1990) Acad Med (Suppl) 65: S63

37 Miller's pyramid mapped to assessment instruments:
- Knows: MCQ
- Knows how: short answer / reasoning, written exams
- Shows how: OSCE, OSLER
- Does: clinical work observed (ACAT, CbD, mini-CEX)
Miller GE (1990) Acad Med (Suppl) 65: S63

38 FY Workplace Assessment
- Mini-CEX (from USA): Mini Clinical Evaluation Exercise
- DOPS (developed by RCP): Direct Observation of Procedural Skills
- CBD (based on GMC performance procedures): Case-based Discussion
- MSF (from industry): Multi-Source Feedback
Carr (2006) Postgrad Med J 82: 576

39 Practical Exercise

40 Educational interventions
How will we assess? How will we give feedback? How will we evaluate?

41 Educational interventions
- Communication skills for cancer specialists
- 2nd year medical speciality training
- Medical humanities SSM for medical students
- Masters-level pharmacology module
- Procedural skills for medical students
- Clinical Officers: ETATMBA

42 Communication skills
Learning outcome: To improve communication skills of HCPs working with individuals with cancer, e.g. with respect to breaking bad news, discussion of management plans, end of life care.
Duration: 2 days.
Students: Mostly specialist cancer nurses, including Macmillan nurses, also consultants and trainee medics. N = 30.
Teaching & learning: Mainly consultations with simulated patients.

43 Speciality training
Learning outcome: To ensure trainees have reached the appropriate stage in the acquisition of the knowledge, skills and attitudes necessary to independent medical practice.
Duration: 1 year.
Students: Second-year GP trainees. N = 50.
Teaching & learning: Clinical apprenticeship, protected training days.

44 Medical humanities SSM
Learning outcome: By a reading of Middlemarch by George Eliot, to enhance students' ability to reflect on medical practice and to enter imaginatively into the lives of others.
Duration: 90-minute sessions weekly for 10 weeks.
Students: Second-year medical students. N = 20.
Teaching & learning: Small group teaching and discussion.

45 M-level pharmacology module
Learning outcome: To enhance knowledge and understanding of pharmacotherapies used in diabetes and its complications, and to develop the ability to apply this knowledge in clinical practice.
Duration: 200 hours, i.e. 20 CATS points.
Students: Mostly DSNs, a few GPs and endocrinology trainees, some from overseas. N = 20.
Teaching & learning: 20 hours directed learning (small group teaching and discussion) and 180 hours self-directed learning.

46 Procedural skills
Learning outcome: To ensure newly-qualified doctors are competent in all the bedside and near-patient procedures listed in Tomorrow's Doctors.
Duration: 4 years.
Students: Medical students. N = 200.
Teaching & learning: Small group teaching sessions distributed across three hospitals.

47 Clinical Officers
Learning outcome: To ………
Duration: x ………
Students: y ………
Teaching & learning: z ……

48 The ideal assessment instrument:
- Totally valid.
- Perfectly reliable.
- Entirely feasible.
- Wholly acceptable.
- Huge educational impact.

49 Part 2: Feedback

50 Feedback: definition
"Specific information about the comparison between a trainee's observed performance and a standard, given with the intent to improve the trainee's performance."
Van de Ridder JM et al (2008) Med Educ 42(2): 189


54 In this section:
- Importance of feedback
- How to give feedback: models
- How to improve feedback

55 Experiential learning

56 Feedback
- Its value is self-evident: experiential learning cannot take place without it.
- It is often infrequent, untimely, unhelpful or incomplete.
- It is often not acted upon to improve performance.

57 How to give feedback
- Establish an appropriate interpersonal climate.
- Use an appropriate location.
- Establish mutually agreed goals.
- Elicit the learner's thoughts and feelings.
- Reflect on observed behaviours.
- Be non-judgmental.
- Relate feedback to specific behaviours.
- Offer the right amount of feedback.
- Offer suggestions for improvement.
Hewson MG & Little ML (1998) J Gen Int Med 13(2): 111

58 Practical Exercise

59 Some methods of feedback: Pendleton's rules, ALOBA, SCOPME model, Chicago model.

60 Pendleton's Rules
- Clarification of matters of fact.
- Trainee identifies what went well.
- Trainer identifies what went well.
- Trainee discusses what they did not do well and how to improve.
- Trainer identifies areas for improvement.
- Agreement on areas for improvement and formulation of an action plan.
Pendleton D et al (1984) in The Consultation: an Approach to Learning and Teaching

61 Difficulties with Pendleton?
- The strict format may inhibit spontaneous discussion.
- Unhelpful polarisation between "good points" and "bad points".
- Opening comments may seem predictable, insincere and merely a prelude to criticism.
Carr (2006) Postgrad Med J 82: 576

62 ALOBA (1 of 2)
- Start with the learner's agenda.
- Look at the outcomes the learner and the patient are trying to achieve.
- Encourage self-assessment and self-problem-solving first.
- Involve the whole group in problem-solving.
- Use descriptive feedback to encourage a non-judgemental approach.
- Provide balanced feedback.
Kurtz et al (1998) Teaching and Learning Communication Skills in Medicine

63 ALOBA (2 of 2)
- Make offers and suggestions; generate alternatives.
- Be well-intentioned, valuing and supportive.
- Rehearse suggestions.
- Value the interview as a gift of raw material for the group.
- Opportunistically introduce theory, research evidence and wider discussion.
- Structure and summarise learning to reach a constructive end-point.
Kurtz et al (1998) Teaching and Learning Communication Skills in Medicine

64 SCOPME model
- Listen.
- Reflect back.
- Support.
- Counsel.
- Treat information in confidence.
- Inform without censuring.

65 Chicago model
- Review aims and objectives of the job at the start.
- Give interim feedback of a positive nature.
- Ask the learner to give a self-assessment of their progress.
- Give feedback on behaviours rather than personality.
- Give specific examples to illustrate your views.
- Suggest specific strategies to the learner to improve performance.

66 Improving feedback
- Recognise that we all need feedback to learn and improve.
- Ask for feedback yourself and model this process for learners.
- Inform learners that you expect them to ask for feedback.
- Make feedback a routine activity.
- Discuss the need for feedback with colleagues.
Sargeant J & Mann K in Cantillon P & Wood D (eds) ABC of Learning and Teaching in Medicine, 2nd edn (2010)

67 Part 3: Evaluation

68 Evaluation: definition
"A systematic approach to the collection, analysis and interpretation of information about any aspect of the conceptualisation, design, implementation and utility of educational programmes."
Mohanna K et al (2004) Teaching Made Easy – a manual for health professionals

69 In this section:
- Purposes of evaluation
- Data sources
- Kirkpatrick's hierarchy

70 The Audit Cycle
Ask question(s) → Review literature → Set criteria & standards → Design audit → Collect data → Analyse data → Feed back findings → Action plan → Re-audit → Review standards, and the cycle repeats.
Wakley G & Chambers R (2005) Clinical Audit in Primary Care

71 Why evaluate?
- To ensure teaching is meeting students' needs.
- To identify areas where teaching can be improved.
- To inform the allocation of faculty resources.
- To provide feedback and encouragement to teachers.
- To support applications for promotion by teachers.
- To identify and articulate what is valued by medical schools.
- To facilitate development of the curriculum.
Morrison (2003) Br Med J 326: 385

72 Evaluation
- Scale: micro to macro
- Formative to summative
- Internal to external
Can you evaluate an assessment?

73 Evaluation: data sources
Student ratings, peer ratings, self-ratings, assessment scores, expert ratings, student interviews, exit ratings, employer ratings, video recordings, administrator ratings, teacher scholarship, teacher awards, teaching portfolios.
Based on: Berk RA (2006) Thirteen Strategies to Measure College Teaching


76 Teaching Observation
A method of evaluating teaching.
- Different models and purposes.
- Three stages: pre-observation, observation, post-observation.
- A form (instrument) for recording the information, observation and feedback.
Siddiqui ZS, Jonas-Dwyer D & Carr SE (2007) Twelve tips for peer observation of teaching. Medical Teacher 29: 297-300

77 Teaching Observation: purposes
- Evaluation: authority / summative
- Developmental: expert / formative
- Peer review: collaborative / mutual learning


79 Evaluation: triangulation between self, peers and students.

80 Practical Exercise

81 Kirkpatrick's Hierarchy
Levels, from bottom to top: Reaction, Learning, Behaviour, Results.
Moving up the hierarchy, the complexity of behaviour and the time elapsed increase, reliable measures become harder to obtain, and confounding factors multiply.
Hutchinson (1999) Br Med J 318: 1267
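To make the four levels concrete, a small sketch pairing each level with the kind of question an evaluator might ask at that level; the example questions are illustrative assumptions, not taken from Hutchinson.

```python
# Kirkpatrick's hierarchy as an ordered structure, lowest level first.
# The example evaluation questions are illustrative assumptions.
KIRKPATRICK = [
    ("Reaction", "Did participants enjoy and value the teaching?"),
    ("Learning", "Did knowledge, skills or attitudes measurably change?"),
    ("Behaviour", "Did participants change what they do in the workplace?"),
    ("Results", "Did clinical or organisational outcomes improve?"),
]

for level, (name, question) in enumerate(KIRKPATRICK, start=1):
    print(f"Level {level}: {name} - {question}")
```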

82 What's the point?


84 Issues with Kirkpatrick
- Is it a hierarchy?
- Omissions: learning objectives?
- Linkages between levels?
- "Kirkpatrick plus"
Tamkin P et al (2002) Kirkpatrick and Beyond, IES

85 Conclusions
- There is no perfect assessment instrument.
- Feedback to students is essential to experiential learning.
- The ultimate purpose of evaluation is to improve clinical outcomes.

