Programmatic assessment for learning: an example of medical education design


1 programmatic assessment for learning: an example of medical education design

2 assignment: build an assessment programme for a workplace-based learning curriculum
- GP training
- practice, assignments plus day-release education
- 1 year, two semesters, 4 terms
- supervisor, medical educator, online platform

3 overview
- a bit of history
- programmatic assessment
- the assessment programme

4 old model of medical competence: knowledge → TEST, skills → TEST, problem solving → TEST, attitudes → TEST

5 the quest for the best test: history
- oral versus written
- open versus closed items
- computer versus paper-and-pencil
- knowledge versus insight
- norm-referenced versus criterion-referenced
…and many more

6 typical approach to assessment: test → pass/fail → competent/incompetent

7 pass/fail → competent/incompetent… but…

8 assessment is not unobtrusive, so it is susceptible to many biases and unwanted effects → what does that mean for the quality of a test? Major problems: competence is much more complex than expected; no extrinsic standards for criterion validity exist.

9 quality elements of assessment: U = R × V × E × C × A (R = reliability, V = validity, E = educational impact, C = costs, A = acceptance). Van der Vleuten CPM. The assessment of Professional Competence: Developments, Research and Practical Implications. Advances in Health Sciences Education 1996;1(1):41-67.

10 reliability [diagram: candidates' scores on a test and a parallel test — Penny 85%, Leonard 73%, Amy 59%, Howard 51% — with a pass/fail cut-off]

11 validity: Kane's view [diagram: inference chain from observation → observed score → universe score → target domain → construct, illustrated for clinical reasoning: observation standards, scales and forms; reliability and reproducibility; saturation and expertise; relation with multiple-choice, key-feature and EMQ formats; think-aloud, CRT, SCT]

12 educational impact [diagram: assessment — its content, format, scheduling and regulatory structure — influences curriculum, teachers and students]

13 educational impact. Cilliers, F. J., Schuwirth, L. W. T., Herman, N., Adendorff, H., & van der Vleuten, C. P. M. (2012). A model of the pre-assessment learning effects of summative assessment in medical education. Advances in Health Sciences Education, 17(1).

14 quality elements of assessment, weighted: U = Rw × Vw × Ew × Cw × Aw (R = reliability, V = validity, E = educational impact, C = costs, A = acceptance, w = weight). Van der Vleuten CPM. The assessment of Professional Competence: Developments, Research and Practical Implications. Advances in Health Sciences Education 1996;1(1):41-67.
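
The weighted formula can be read as a multiplicative model in which each quality element is raised to a stakeholder-set weight. A minimal sketch, assuming weights act as exponents; the scores and weights below are illustrative, not taken from the slide:

```python
def utility(scores, weights):
    """Multiplicative utility: each quality element contributes according to
    its weight; a low score on any heavily weighted element drags U down."""
    u = 1.0
    for element, score in scores.items():
        u *= score ** weights[element]
    return u

# Illustrative numbers only: element scores on a 0-1 scale,
# weights chosen by hypothetical stakeholders.
scores = {"R": 0.8, "V": 0.7, "E": 0.9, "C": 0.6, "A": 0.8}
weights = {"R": 1.0, "V": 1.5, "E": 2.0, "C": 0.5, "A": 1.0}
print(round(utility(scores, weights), 3))
```

Because the model is multiplicative, a near-zero score on any weighted element collapses overall utility — which is the point of the formula: no quality criterion can be traded away entirely.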

15 but… testing requires some strange assumptions

16 underlying concepts
- traits are stable and generic characteristics

17-18 underlying concepts: stable trait [diagram, built over two slides: test items 1-4 mapped onto traits A, B, C at test moments T and T'; person × trait matrix]

19 underlying concepts
- traits are stable and generic characteristics
- individual items in themselves are meaningless

20 underlying concepts: meaningless items. Ms. Smit is 72 years old. She has angina pectoris. Several times her blood pressure is taken and found to be 170/100 mmHg. Which antihypertensive drug is most indicated for her? a. captopril b. chlorthalidone c. metoprolol

21 underlying concepts: meaningless items. Mr. Johnson, 35 years old, consults his GP with complaints of chest pain. Without further information about Mr. Johnson the most likely origin of his chest pain is: a. the chest wall b. the lungs c. the myocardium d. the esophagus

22 resuscitation ‘station’ in a skills test

23 underlying concepts: meaningless items communication ‘station’ in a skills test

24 underlying concepts
- traits are stable and generic characteristics
- individual items in themselves are meaningless
- sum scores determine what the test measures
- statistics are based on elimination of information

25 underlying concepts: reductionism
answer: a c b a a e
key:    b c a a b e
[diagram: the per-item pattern is reduced to a single percentage on a 0-100% scale — failed]
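
The reduction this slide illustrates can be made concrete: the per-item right/wrong pattern carries diagnostic information, but scoring collapses it into one percentage and then a verdict. A small sketch using the slide's answer and key strings; the 60% cut-off is a hypothetical assumption:

```python
# Answers and key as on the slide; the cut-off is an illustrative assumption.
answers = ["a", "c", "b", "a", "a", "e"]
key     = ["b", "c", "a", "a", "b", "e"]

item_scores = [a == k for a, k in zip(answers, key)]  # per-item right/wrong
percent = 100 * sum(item_scores) / len(key)           # collapsed to one number
verdict = "passed" if percent >= 60 else "failed"     # collapsed even further

print(item_scores, percent, verdict)
```

Everything the `item_scores` list tells a teacher about *which* items went wrong is discarded once only `percent` and `verdict` are reported.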

26 underlying concepts
- traits are stable and generic characteristics
- individual items in themselves are meaningless
- sum scores determine what the test measures
- statistics are based on elimination of information
- one single best instrument for each trait

27 old model of medical competence: knowledge → TEST, skills → TEST, problem solving → TEST, attitudes → TEST

28 competencies: competencies are simple or more complex tasks that a successful candidate must be able to handle, during which s/he uses, at the right time, the correct and relevant knowledge, skills, attitudes and meta-cognitions to manage the situation successfully.

29 competency domains or roles (national Dutch blueprint)
1. medical expert
2. scientist
3. worker in the health care system
4. person

30 overview
- a bit of history
- programmatic assessment
- the assessment programme

31 from building blocks…

32 …to buildings

33 from methods to programmes
► multiple instruments, various formats
► strengths and weaknesses combined
► assessment moments ≠ decision moments

34 every assessment moment is a decision moment: test → decision + test → decision + test → decision = competent

35 every assessment moment is NOT a decision moment [diagram: many assessment moments (A) feed into low-stakes, medium-stakes and high-stakes decisions]

36 from methods to programmes
► multiple instruments, various formats
► strengths-weaknesses combined
► assessment moment ≠ decision moment
► multiple quality approaches

37 quality: reliability
- consistency
- saturation
- expertise
- organisation

38 reliability is sampling [graph: reliability as a function of testing time in hours, for MCQ, short essay, paper cases, orals, observation, practice assessment and video test] Norcini et al.; Stalenhoef-Halling et al.; Swanson; Newble & Swanson; Ram et al., 1999
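
The relationship behind curves like these is commonly projected with the Spearman-Brown prophecy formula — not named on the slide, but the standard psychometric way to estimate how reliability grows as testing time (sampling) increases. A sketch assuming an illustrative baseline reliability of 0.60 for one hour of testing:

```python
def spearman_brown(r, k):
    """Projected reliability of a test lengthened k-fold,
    given the reliability r of the original test."""
    return k * r / (1 + (k - 1) * r)

r_one_hour = 0.60  # assumed, for illustration only
for hours in (1, 2, 4, 8):
    print(hours, "h:", round(spearman_brown(r_one_hour, hours), 2))
```

The diminishing returns the formula produces (large gains from one to two hours, small gains from four to eight) mirror the shape of the testing-time curves the slide refers to.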

39 generalisability: saturation [sampling illustration: orange, green, blue, red, yellow, purple, black… nothing new]
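
The stopping rule implicit here — keep sampling until nothing new turns up — can be sketched as a small simulation. The colour list comes from the slide; the stop-after-five-repeats rule and the random seed are illustrative assumptions:

```python
import random

def sample_until_saturation(population, stop_after=5, seed=1):
    """Draw observations at random until `stop_after` consecutive draws
    yield nothing new; return what was seen and how many draws it took."""
    rng = random.Random(seed)
    seen, since_new, draws = set(), 0, 0
    while since_new < stop_after:
        draws += 1
        obs = rng.choice(population)
        if obs in seen:
            since_new += 1
        else:
            seen.add(obs)
            since_new = 0
    return seen, draws

colours = ["orange", "green", "blue", "red", "yellow", "purple", "black"]
seen, draws = sample_until_saturation(colours)
print(sorted(seen), draws)
```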

40 Overview: portfolio committee assessment procedure [timeline Sept–June: introduction to students; mentors are trained; first portfolio submission; formative review; examiner training (benchmark portfolios); 2nd portfolio submission; summative review] [decision flowchart: mentor/student recommendation (F/P/D) → does the mentor agree with the student? → examiner 1: does the examiner agree with the mentor? → examiner 2: does examiner 1 agree with examiner 2? → final judgment by the full committee (n=20) → exam committee decision]


46 from methods to programmes
► multiple instruments, various formats
► strengths-weaknesses combined
► assessment moment ≠ decision moment
► multiple quality approaches
► many instruments : many competency domains

47 one role → one instrument [matrix: instruments A, B, C, D each mapped to a single domain: medical expert, scientist, worker in the health care system, person]

48 multi-modal assessment [matrix: each instrument informs several domains: medical expert, scientist, worker in the health care system, person]

49 from methods to programmes
► multiple instruments, various formats
► strengths-weaknesses combined
► assessment moment ≠ decision moment
► multiple quality approaches
► many instruments : many competency domains
► integrative → holistic, not reductionist

50 overview
- a bit of history
- programmatic assessment
- the assessment programme

51 assignment: build an assessment programme for a workplace-based learning curriculum
- GP training
- practice, assignments plus day-release education
- 1 year, two semesters, 4 terms
- supervisor, medical educator, online platform

52 design
- goals and stated purpose
- programme in action
- supporting the programme
- documenting the programme
- improvement approaches to the programme
- accounting for the programme
Dijkstra J, Van der Vleuten C, Schuwirth L. A new framework for designing programmes of assessment. Advances in Health Sciences Education 2010;15:379-93.

53 If ‘incompetence’ were an illness, how would we diagnose and treat it?

54 design
- multiple instruments
- meaningful collation
- learning focused
- self-regulation
- assessment moment ≠ decision moment
- longitudinal
- feasible and efficient

55 purpose

56 safe independent practitioner: medical expert, worker in the health care system, person, scholar

57 what is safe?

58 mastery + skill + competence + … + self-regulation

59 self-driven
- analyses
- external information seeking
- goal orientation
- prioritisation
- realisation/attainment
- time management
1. Bandura A. Social cognitive theory: an agentic perspective. Annual Review of Psychology 2001;52.
2. Dochy F, Segers M, Sluijsmans D. The Use of Self-, Peer and Co-assessment in Higher Education: a review. Studies in Higher Education 1999;24(3).
3. Eva KW, Cunnington JPW, Reiter HI, Keane D, Norman G. How can I know what I don't know? Poor self assessment in a well-defined domain. Advances in Health Sciences Education 2004;9.

60 The opposite of good is... …well intended

61 perfect assessment programme [word cloud of competing demands: time, understanding, patient care, motivation, costs, expertise, beliefs, laws, expectations, context, culture]

62 relevant research findings
- meaningfulness
1. Posner MI. What is it to be an expert? In: Chi MTH, Glaser R, Farr MJ, editors. The nature of expertise. Hillsdale, NJ: Lawrence Erlbaum Associates, 1988:xxix-xxxvi.
2. Schmidt HG, Boshuizen HP. On acquiring expertise in medicine. Special Issue: European educational psychology. Educational Psychology Review 1993;5(3).

63 learning in context
- a newspaper is better than a glossy magazine
- the seashore is better than the street
- first it is better to run than to walk
- you will have to try several times
- some skills are required but it is easy to learn
- even small children can enjoy it
- once successful the risk of complications is minimal
- birds seldom get too close
- rain soaks in very fast
- a rock can serve as an anchor
- once it breaks loose there is no second chance

64 learning in context: flying a kite
- a newspaper is better than a glossy magazine
- the seashore is better than the street
- first it is better to run than to walk
- you will have to try several times
- some skills are required but it is easy to learn
- even small children can enjoy it
- once successful the risk of complications is minimal
- birds seldom get too close
- rain soaks in very fast
- a rock can serve as an anchor
- once it breaks loose there is no second chance

65 relevant research findings
- meaningfulness
- transfer and domain specificity
1. Eva K. On the generality of specificity. Medical Education 2003;37.
2. Eva KW, Neville AJ, Norman GR. Exploring the etiology of content specificity: Factors influencing analogic transfer and problem solving. Academic Medicine 1998;73(10):S1-5.

66 analogous transfer

67 relevant research findings
- meaningfulness
- transfer and domain specificity
- deliberate practice
Ericsson KA. An expert-performance perspective of research on medical expertise: the study of clinical performance. Medical Education 2007;41.

68 deliberate practice: assessment → feedback → adjustment

69 feedback
- concrete
- constructive
- focused on improvement
- 'connected'
- leading to learning goals/learning plans
Shute V. Focus on formative feedback. Review of Educational Research 2008;78.

70 loop [diagram: learning goals → learning activities → practice → analysis → feedback → learning goals]

71 relevant research findings
- meaningfulness
- transfer and domain specificity
- deliberate practice
- self-regulated learning

72 [matrix of self-regulated learning: phases (forethought, planning & activation; monitoring; control; reaction & reflection) × areas (cognition, motivation, behaviour, context)] cf. Schunk DH (2005). Self-regulated learning: The educational legacy of Paul R. Pintrich. Educational Psychologist, 40.

73 relevant research findings
- meaningfulness
- transfer and domain specificity
- deliberate practice
- self-regulated learning
- reasoning and decision making
1. Boreham NC. The dangerous practice of thinking. Medical Education 1994;28.
2. Klein G. Naturalistic Decision Making. Human Factors 2008;50(3).
3. Plous S. The psychology of judgment and decision making. New Jersey: McGraw-Hill.
4. Schmidt HG, Machiels-Bongaerts M, Hermans H, ten Cate TJ, Venekamp R, Boshuizen HPA. The Development of Diagnostic Competence: Comparison of a Problem-based, an Integrated, and a Conventional Medical Curriculum. Academic Medicine 1996;71(6).

74 relevant research findings
- reliability
- validity
- quality frameworks
- organisational reliability
1. Williams M, Klamen D, McGaghie W. Cognitive, Social and Environmental Sources of Bias in Clinical Performance Ratings. Teaching and Learning in Medicine 2003;15(4).
2. Kane MT. Validation. In: Brennan RL, editor. Educational Measurement. Westport: ACE/Praeger, 2006.
3. Govaerts MJB. Climbing the pyramid: Towards understanding performance assessment. Maastricht University.
4. Dijkstra J, Van der Vleuten C, Schuwirth L. A new framework for designing programmes of assessment. Advances in Health Sciences Education 2010;15:379-93.

75 term 1 [timeline: CCA, direct observations, MCQ test, mini-releases, MSF, portfolio; mid-term and end-term]

76 term 2 [timeline: CCA, direct observations, audit, MCQ tests, MSF, mini-releases, portfolio; mid-term and end-term]

77 critical case analysis
- 5 write-ups of real patient consultations
- relevance analysis
- learning activities
- produce exam questions (EMI, KFP, MCQ), increasingly from original literature
- any discussion minuted by registrar

78 directly observed consultations
- 9 real patient consultations
- relevance analysis
- learning goals (practical + theoretical)
- learning activity
- demonstration of success in next observed consultation
- discussion minuted by registrar

79 clinical audit
- analysis of the practice environment
- determination of a specific question
- collection of data
- draw conclusions
- describe plan for change
- +3 months: look back and annotate
- any discussion minuted by registrar

80 multiple-choice tests
- 3 tests of 60 items each, blueprinted
- sit and submit your answers
- review items and answer key
- comment on and criticise questions for correctness
- present in mini-release
- 'lodge' appeal against questions
- score calculation and feedback to registrars

81 mini-releases
- flexible agenda
- building informal networks
- discuss MCQ test items
- compile: appeals against questions; list of the 'informal' network

82 multi-source feedback
- 2 times per year
- nurses, practice manager, receptionist, other practice staff and registrar
- discussed with supervisor (end-term assessment) and with ME (minuted by registrar)
- simple form: dealing with tasks, others and yourself
- simple ordinal scale
- ample room for qualitative comments

83 mid- and end-term assessment
- integrative, reviewing all the information
- learning goals and/or remediation plans
- advisory to performance review committee
- minuted by registrar

84 portfolio
- complete dossier including minutes
- individual extra information (only if relevant)
- audit trail
- basis for future CV or position applications

85 example of a 'line' [diagram: CCA → learning → MCQ test → appeal → group appeal → feedback → analysis → feedback; outcomes: meaning, test-enhanced learning, informal/social networks, transformation; research: narratives for feedback]

86 design
- goals and stated purpose
- programme in action
- supporting the programme
- documentation of the programme
- improvement approaches to the programme
- accounting for the programme
Dijkstra J, Van der Vleuten C, Schuwirth L. A new framework for designing programmes of assessment. Advances in Health Sciences Education 2010;15:379-93.

87 rules and regulation
- self-responsibility comes with accountability (minutes, plagiarism, fraud)
- focus on learning and remediation
- information provision to the registrar
- documentation transparency
- second opinion/appeals/switch of ME or supervisor
- organisation reliability/credibility

88 staff development: efficiency
- short analyses
- concrete learning goals
- focus on learning
- training of staff (expertise → efficiency)
- admin support by admin staff
- division of advisory and decision roles

89 further requirements
- goals and stated purpose
- programme in action
- supporting the programme: regulations
- documentation of the programme: ELO, fact sheets
- improvement approaches to the programme: systematic evaluation
- accounting for the programme: research

90 Thank you

