Standard setting for clinical assessments Katharine Boursicot, BSc, MBBS, MRCOG, MAHPE Reader in Medical Education Deputy Head of the Centre for Medical and Healthcare Education St George’s, University of London The Third International Conference on Medical Education in the Sudan
WHAT are we testing in clinical assessments? Clinical competence What is it?
A popular modern model: elements of competence
Knowledge
o factual
o applied: clinical reasoning
Skills
o communication
o clinical
Attitudes
o professional behaviour
Tomorrow’s Doctors, GMC 2003
Another popular medical model of competence: Miller’s pyramid
Does – professional authenticity
Shows how
Knows how
Knows
Cognition ~ knowledge; Behaviour ~ skills/attitudes
Miller GE. The assessment of clinical skills/competence/performance. Academic Medicine (Supplement) 1990; 65: S63–S67.
Assessment of competence A review of developments over the last 40 years
Knows 1960: the National Board of Medical Examiners in the USA introduced the MCQ. MCQs conquered the world. Dissatisfaction followed, due to the limitations of MCQs.
Knows how 1965: Introduction of the PMP (Patient Management Problem)
Patient Management Problem: Clinical Scenario → Action
Knows how 1965: Introduction of the PMP (Patient Management Problem). Well-constructed SBA-format MCQs can test the application of knowledge very effectively.
Shows how 1975: Introduction of the Objective Structured Clinical Examination (OSCE). OSCEs are conquering the world.
> 2000: emerging new methods WBAs – Workplace-Based Assessments
o Mini Clinical Evaluation Exercise (mini-CEX)
o Direct Observation of Practical Procedure (DOPS)
o OSATS
o Masked standardized patients
o Video assessment
o Patient reports
o Peer reports
o Clinical work samples
o ………
Mini-CEX (Norcini, 1995): short observation (15–20 minutes) and evaluation of clinical performance in practice, using generic evaluation forms completed by different examiners
Example of mini-CEX form
DOPS – Direct Observation of Practical Procedure
OSATS – Objective Structured Assessment of Technical Skills
WBAs – Workplace-Based Assessments All based on the principle of an assessor observing a student/trainee in a workplace or practice setting
Past 40 years: climbing the pyramid.....
Knows – factual tests: SBA-type MCQs…..
Knows how – (clinical) context-based tests: SBA, EMQ, MEQ…..
Shows how – performance assessment in vitro: OSCEs
Does – performance assessment in vivo: mini-CEX, DOPS, OSATS, …..
Standard setting – why bother? To assure standards
o At graduation from medical school
o For licensing
o For a postgraduate (membership) degree
o For progression from one grade to the next
o For recertification
At graduation from medical school To award a medical degree to students who meet the University’s standards (University interest) To distinguish between the competent and the insufficiently competent (Public interest) To certify that graduates are suitable for provisional registration (Regulatory/licensing body interest) To ensure graduates are fit to undertake F1 posts (employer interest)
Definition of Standards A standard is a statement about whether an examination performance is good enough for a particular purpose
o a particular score that serves as the boundary between passing and failing
o the numerical answer to the question “How much is enough?”
Standard setting All methods described in the literature are based on ways of translating expert (clinical) judgement into a score
‘Classical’ standard setting methods
For written test items:
o Angoff’s method
o Ebel’s method
For OSCEs:
o Borderline group method
o Regression-based method
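As an illustrative aside not in the original slides: Angoff’s method, as commonly described in the standard-setting literature, has each judge estimate the probability that a borderline (minimally competent) candidate would answer each written item correctly; the passing score is the mean, over judges, of the summed estimates. The function name and figures below are purely illustrative.

```python
# Sketch of Angoff's method (illustrative data, not from the presentation).
def angoff_passing_score(judgements):
    """judgements: one list per judge of per-item probabilities that a
    borderline candidate answers the item correctly."""
    # Each judge's implied passing score is the sum of their item estimates.
    per_judge_totals = [sum(items) for items in judgements]
    # The test's passing score is the mean of the judges' totals.
    return sum(per_judge_totals) / len(per_judge_totals)

# Three judges rating a five-item test:
judges = [
    [0.8, 0.6, 0.9, 0.5, 0.7],  # judge A
    [0.7, 0.5, 0.8, 0.6, 0.6],  # judge B
    [0.9, 0.6, 0.8, 0.4, 0.7],  # judge C
]
print(angoff_passing_score(judges))  # passing score out of 5 items
```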
Performance-based standard setting methods
Borderline group method
Contrasting groups method
Regression-based standard method
Kramer A, Muijtjens A, Jansen K, Düsman H, Tan L, van der Vleuten C. Comparison of a rational and an empirical standard setting procedure for an OSCE. Medical Education 2003; 37(2): 132.
Kaufman DM, Mann KV, Muijtjens AMM, van der Vleuten CPM. A comparison of standard-setting procedures for an OSCE in undergraduate medical education. Acad Med 2000; 75:
The examiner’s role in standard setting
Uses the examiner’s clinical expertise to judge the candidate’s performance
Examiner allocates a global judgement based on the candidate’s performance at that station
Remember the level of the examination
Global ratings: Pass / Borderline / Fail
Borderline Group Method [figure: a station checklist (items 1–7, TOTAL) with a global Pass / Borderline / Fail rating per candidate; the passing score is derived from the borderline candidates’ score distribution within the overall test score distribution]
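A minimal computational sketch of the borderline group method, with illustrative data (not from the presentation): each candidate receives a checklist score and a global Pass/Borderline/Fail rating, and the station’s passing score is taken as the mean checklist score of the borderline group.

```python
# Sketch of the borderline group method (illustrative data).
from statistics import mean

def borderline_group_cut_score(results):
    """results: list of (checklist_score, global_rating) tuples,
    where global_rating is "pass", "borderline", or "fail"."""
    # Collect the checklist scores of candidates rated "borderline"...
    borderline = [score for score, rating in results if rating == "borderline"]
    # ...and take their mean as the station's passing score.
    return mean(borderline)

station = [
    (18, "pass"), (12, "borderline"), (9, "fail"),
    (14, "borderline"), (20, "pass"), (10, "borderline"),
]
print(borderline_group_cut_score(station))  # → 12
```

With a large cohort, the borderline group is big enough for its mean to be a stable cut score, which is why the method needs a large(-ish) number of candidates.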
Contrasting groups method [figure: a station checklist with global Pass / Borderline / Fail ratings; the test score distributions of the passing and failing groups are plotted, and the passing score is set where they overlap]
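A minimal sketch of the contrasting groups method, again with illustrative data: candidates are split into passing and failing groups by the examiners’ global judgements, and the cut score is placed where the two score distributions cross; here that is approximated as the threshold minimising the number of misclassified candidates.

```python
# Sketch of the contrasting groups method (illustrative data).
def contrasting_groups_cut_score(pass_scores, fail_scores):
    """Choose the checklist-score threshold that minimises
    misclassification between the two contrasting groups."""
    candidates = sorted(set(pass_scores) | set(fail_scores))
    best_cut, fewest_errors = None, None
    for cut in candidates:
        # A "pass" candidate below the cut, or a "fail" candidate at or
        # above it, would be misclassified by this threshold.
        errors = (sum(s < cut for s in pass_scores)
                  + sum(s >= cut for s in fail_scores))
        if fewest_errors is None or errors < fewest_errors:
            best_cut, fewest_errors = cut, errors
    return best_cut

passes = [14, 15, 16, 17, 18, 19]  # globally judged competent
fails = [8, 9, 11, 12, 13, 15]     # globally judged not competent
print(contrasting_groups_cut_score(passes, fails))  # → 14
```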
Performance-based standard setting Advantages
Utilises the expertise of the examiners
o they are observing the performance of the students
o they are in a position to make a (global) judgement about the performance, based on their:
- clinical expertise
- expected standards for the level of the test
- knowledge of the curriculum/teaching
Borderline/ contrasting/ regression-based methods Advantages Large number of examiners set a collective standard while observing the candidates – not just an academic exercise Reliable: cut-off score based on large sample of judgments Credible and defensible: based on expert judgment in direct observation
Performance-based standard setting Disadvantages
Requires a large(-ish) cohort of candidates to achieve enough numbers in the ‘borderline’ group
Passing score not known in advance
Judgements not independent of checklist scoring
Requires expert processing of marks immediately after the exam
o Checking of results
o Delay in producing results
Work Based Assessment tools No gold standard standard setting method!
Standard setting Standards are based on informed judgements about examinees’ performances against a social or educational construct, e.g.
o competent practitioner
o suitable level of specialist knowledge/skills
Standard setting for Work Based Assessment tools
o Based on descriptors for a particular level of training
o Information gathering relying on descriptive and qualitative judgemental information
o Descriptors agreed by consensus/panel of clinical experts
o Purpose of WBA tools: formative rather than summative – feedback
Feedback Giving feedback to enhance learning involves some form of judgement by the feedback giver on the knowledge and performance of the recipient. It is a very powerful tool!
WBAs and feedback Underlying principle of WBA tools is FEEDBACK from
o Teacher/supervisor
o Peers/team members
o Other professionals
o Patients
Conclusions It’s not easy to set standards for Work Based Assessments (in the ‘classic’ sense) Expert professional judgement is required Wide sampling from different sources: range of tools, contexts, cases and assessors Feedback to the trainee