Training the OSCE Examiners

Presentation on theme: "Training the OSCE Examiners"— Presentation transcript:

1 Training the OSCE Examiners
Katharine Boursicot Trudie Roberts

2 Programme
- Principles of OSCEs for examiners
- Video marking
- Marking live stations
- Strategies for enhancing examiner participation in training

3 Academic principles of OSCEs
- The basics: What is an OSCE?
- More academic detail: Why use OSCEs?
- The role of examiners: Examiners in OSCEs

4 The basics
- For examiners who are new to OSCEs
- A brief reminder for those who are already familiar with OSCEs

5 What is an OSCE? Objective Structured Clinical Examination

6 OSCE test design [diagram: candidates rotate through a timed circuit of stations]

7 OSCEs - Objective All the candidates are presented with the same test

8 OSCEs - Structured
The marking scheme for each station is structured.
Specific skill modalities are tested at each station:
- History taking
- Explanation
- Clinical examination
- Procedures

9 OSCEs – Clinical Examination
A test of the performance of clinical skills, not a test of knowledge: the candidates have to demonstrate their skills.

10 More academic detail Why use OSCEs in clinical assessment?
- Improved reliability
- Fairer test of candidates' clinical abilities

11 Why use OSCEs in clinical assessment?
- Careful specification of content
- Observation of a wide sample of activities
- Structured interaction between examiner and student
- Structured marking schedule
- Each student has to perform the same tasks

12 Characteristics of assessment instruments
Utility = Reliability × Validity × Educational impact × Acceptability × Feasibility
Reference: Van der Vleuten C. The assessment of professional competence: developments, research and practical implications. Advances in Health Science Education 1996; 1: 41-67.

13 Test characteristics Reliability of a test/measure:
- reproducibility of scores across raters, questions, cases, occasions
- capability of differentiating consistently between good and poor students
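One common way to quantify this reproducibility in an OSCE is Cronbach's alpha computed across stations (treating stations as items and candidates as cases). A minimal sketch with no external dependencies; the function name and score matrix are invented for illustration:

```python
# Cronbach's alpha across OSCE stations: alpha = k/(k-1) * (1 - sum of
# per-station variances / variance of candidates' total scores).
# Values near 1 indicate the stations rank candidates consistently.

def cronbach_alpha(scores):
    """scores: one list per candidate, each holding that candidate's
    per-station scores (same station order for every candidate)."""
    k = len(scores[0])   # number of stations
    def variance(xs):    # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    station_vars = [variance([cand[j] for cand in scores]) for j in range(k)]
    total_var = variance([sum(cand) for cand in scores])
    return k / (k - 1) * (1 - sum(station_vars) / total_var)

# Invented example: three candidates, two stations, perfectly consistent
# ranking across stations.
print(cronbach_alpha([[1, 1], [2, 2], [3, 3]]))  # -> 1.0
```

In practice alpha is computed over many stations and candidates; the broad-sampling point on the next slide is exactly about pushing this figure up.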

14 Sampling [diagram: each test draws only a sample from the domain of interest]

15 Reliability Competencies are highly domain-specific, so broad sampling is required to obtain adequate reliability:
- across content, i.e. a range of cases/situations
- across other potential sources of error variance, i.e. testing time, examiners, patients, settings, facilities

16 OSCE blueprint

          History        Explanation               Examination   Procedure
CVS       Chest pain     Discharge drugs           Cardiac       BP
RS        Haemoptysis    Smoking                   Resp          Peak flow
GIS       Abdo pain      Gastroscopy               Abdo, PR
Repro     Amenorrhoea    Abnormal smear                          Cx smear
NS        Headache                                 Eyes          Ophthalmoscopy
MS        Backache                                 Hip
Generic   Pre-op assess  Consent for post mortem                 IV cannulation, Blood transfusion reaction

17 Test characteristics Validity of a test/measure:
the test measures the characteristic (e.g. knowledge, skills) that it is intended to measure

18 Model of competence Miller's pyramid, from base to apex:
- Knows
- Knows how (cognition ~ knowledge)
- Shows how
- Does (behaviour ~ skills/attitudes)
Professional authenticity increases towards the apex.
Reference: Miller GE. The assessment of clinical skills/competence/performance. Academic Medicine (Supplement) 1990; 65: S63-S67.

19 Validity of testing formats
- Does: professional practice assessment
- Shows how: performance assessment (OSCEs, long/short cases, OSLERs, etc.)
- Knows how: problem-solving assessment (EMQs, SEQs)
- Knows: knowledge assessment (MCQs)

20 Test characteristics: Educational impact
Relationship between assessment and learning [diagram linking curriculum, assessment, teacher and student]

21 Test characteristics Feasibility:
- cost
- human resources
- physical resources

22 Test characteristics Acceptability: tolerable effort and reasonable cost for
- doctors
- licensing bodies
- employers
- patients/consumer groups
- students
- faculty

23 The role of examiners in OSCEs
- General
- Types of stations
- Standard setting
- Practice at marking

24 The role of examiners in OSCEs
- To observe the performance of the student at a particular task
- To score according to the marking schedule
- To contribute to the good conduct of the examination

25 The role of examiners in OSCEs
It is NOT to:
- conduct a viva voce
- re-write the station
- interfere with the simulated patient's role
- design their own marking scheme
- teach

26 Types of OSCE stations
- History taking
- Explanation
- Clinical examination
- Procedures

27 Communication skills Stations involving patients, simulated patients or volunteers. Content vs process, i.e. what the candidate says vs how the candidate says it.

28 Clinical skills
- People
- Manikins
- Professional behaviour
- Describe actions to the examiner

29 The examiner’s role in standard setting
- Use your clinical expertise to judge the candidate’s performance
- Allocate a global judgement on the candidate’s performance at that station
- Remember the level of the examination

30 Global scoring
- Excellent pass
- Very good pass
- Clear pass
- Borderline
- Clear fail

31 Borderline method
[diagram: every candidate is scored on the station checklist and also given a global rating (pass / borderline / fail); the distribution of checklist scores in the borderline group determines the passing score for the station]
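The arithmetic behind the borderline-group method shown on this slide is simple: the station's passing score is the mean checklist score of the candidates rated borderline. A minimal sketch; the function name and the result data are invented for illustration:

```python
# Borderline-group standard setting: the cut score for a station is the
# mean checklist score of candidates whose global rating was "borderline".

def borderline_group_cutoff(results):
    """results: list of (checklist_score, global_rating) pairs, where
    global_rating is 'pass', 'borderline' or 'fail'."""
    borderline = [score for score, rating in results if rating == "borderline"]
    if not borderline:
        raise ValueError("no borderline candidates at this station")
    return sum(borderline) / len(borderline)

# Invented station results for five candidates.
station = [(12, "pass"), (9, "borderline"), (11, "borderline"),
           (10, "borderline"), (6, "fail")]
print(borderline_group_cutoff(station))  # -> 10.0
```

The method needs a reasonable number of borderline ratings per station to be stable, which is one reason examiner training in global scoring matters.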

32 Regression based standard
[diagram: checklist scores are regressed on the overall global rating (1 = clear fail, 2 = borderline, 3 = clear pass, 4 = very good pass, 5 = excellent pass); the checklist score predicted at the borderline rating is taken as the passing score]
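The regression-based standard can be sketched the same way: fit a least-squares line of checklist score on global rating across all candidates, then read off the predicted checklist score at the borderline rating (2 on the five-point scale above). A sketch with invented data; a real implementation would typically use a statistics library:

```python
# Regression-based standard setting: regress checklist scores on global
# ratings (1 = clear fail ... 5 = excellent pass) and take the checklist
# score predicted at the borderline rating as the passing score.

def regression_cutoff(ratings, scores, borderline_rating=2):
    n = len(ratings)
    mean_x = sum(ratings) / n
    mean_y = sum(scores) / n
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(ratings, scores))
    sxx = sum((x - mean_x) ** 2 for x in ratings)
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return intercept + slope * borderline_rating

# Invented example: ratings and checklist scores for five candidates.
print(regression_cutoff([1, 2, 3, 4, 5], [6, 9, 12, 15, 18]))  # -> 9.0
```

Unlike the borderline-group method, this uses every candidate's rating, not just the borderline group, so it is less sensitive to small borderline samples.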

33 Practice at marking Videos Live stations Mini-OSCE

34

35 Strategies for enhancing examiner participation
- CME
- Job plan / part of contract
- Specific allocation of SIFT
- Experience for postgraduate examinations
- Payment

