“Scoring an Oral Simulation Exam” Elizabeth A. Witt, Ph.D. American Board of Emergency Medicine Presented at the 2005 CLEAR Annual Conference September.


1 “Scoring an Oral Simulation Exam” Elizabeth A. Witt, Ph.D. American Board of Emergency Medicine Presented at the 2005 CLEAR Annual Conference September Phoenix, Arizona

2 ABEM Certification Process
– Complete residency in Emergency Medicine
– Pass written certification examination
– Pass oral certification examination, a series of simulated patient encounters

3 Purpose of ABEM’s Oral Certification Examination
– Assess clinical performance
– Test the application of knowledge of Emergency Medicine

4 Structure of ABEM’s Oral Certification Examination
– 7 simulations based on actual clinical cases: 1 field test simulation, 4 single patient encounters, 2 multiple patient encounters
– One-on-one; 7 different examiners
– Examiner introduces each case and may play the role of patient, nurse, consultant, etc.

5 (image-only slide; no transcript text)

6 Scoring
Done by the examiner during and immediately after the session

7 Examiner Qualifications
– ABEM diplomate for at least 5 years
– Residency trained in EM (ACGME- or RCPSC-approved)
– Actively involved in the practice of clinical Emergency Medicine
– Nominated in writing by a current examiner, director, or senior director

8 Examiner Qualifications, cont.
– Evaluated and recommended by ABEM’s Test Administration Committee
– Appointed by the Board of Directors
– Distinguished for high quality patient care, teaching, research, or leadership

9 Examiner Training Before the Exam
– Focus on standardizing the delivery and scoring of each case
– Demonstrations
– Training video
– Scoring practice with feedback
– Case presentation practice with feedback and coaching
– The “what-ifs”

10 Examiner Training During the Exam
– Observe a real simulation first
– Written materials support each case
– Observed by an experienced examiner early
– End of 1st day: group discussion and individual feedback, coaching
– Scoring sheets and notes reviewed by chief examiners
– Ongoing discussions, feedback, mentoring

11 Standardized Ratings
– 8 performance criteria, each rated on a scale of 1 to 8
– Critical actions, rated Yes/No
– Dangerous actions

12 Performance Criteria
– Data Acquisition
– Problem Solving
– Patient Management
– Resource Utilization
– Health Care Provided (Outcome)
– Interpersonal Relations
– Comprehension of Pathophysiology
– Clinical Competence (Overall)

13 (image-only slide; no transcript text)

14 (image-only slide; no transcript text)

15 Final Score and Pass/Fail
Only performance criteria ratings are used. Two ways to pass:
1. Grand mean of all performance criteria ratings > 5.75
2. Case score = mean of the performance criteria for each case. The highest and lowest case scores are averaged. If the hi-lo average AND all of the remaining case scores are > 5.00, pass.
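The two passing standards above can be sketched as a short Python function. This is a hypothetical illustration, not ABEM's actual implementation; the function name, the input format (one list of ratings per scored case), and the strict comparisons at the 5.75 and 5.00 cutoffs are assumptions based on how the slides phrase the standards.

```python
def passes_oral_exam(ratings_by_case, grand_mean_cutoff=5.75, case_cutoff=5.00):
    """Sketch of the two ABEM pass standards described on the slide.

    ratings_by_case: one list of 1-8 performance criteria ratings per scored case.
    """
    # Standard 1: grand mean of every individual rating must exceed 5.75.
    all_ratings = [r for case in ratings_by_case for r in case]
    if sum(all_ratings) / len(all_ratings) > grand_mean_cutoff:
        return True
    # Standard 2: average the highest and lowest case means; that average
    # and every remaining case mean must each exceed 5.00.
    case_means = sorted(sum(c) / len(c) for c in ratings_by_case)
    hi_lo_mean = (case_means[0] + case_means[-1]) / 2
    remaining = case_means[1:-1]
    return hi_lo_mean > case_cutoff and all(m > case_cutoff for m in remaining)
```

A candidate who clears either standard passes, which is why the examples on the next two slides can reach different verdicts from the same ratings.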

16 Example 1: Grand Mean Standard
– Sum of performance criteria ratings = 375
– Number of ratings = (8 × 4) + (18 × 2) = 68
– Grand mean = 375/68 = 5.51
– Is 5.51 > 5.75? NO → Fail
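The arithmetic on this slide can be checked directly in plain Python, reproducing the slide's numbers (8 ratings for each of the 4 single-patient cases, 18 for each of the 2 multiple-patient cases):

```python
total = 375                     # sum of all performance criteria ratings
n_ratings = (8 * 4) + (18 * 2)  # 32 single-case + 36 multi-case ratings = 68
grand_mean = total / n_ratings
print(round(grand_mean, 2))     # 5.51
print(grand_mean > 5.75)        # False -> fails the grand-mean standard
```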

17 Example 2: Case Score Average (High-Low) Standard
– Mean of performance ratings for each case = 4.75, 5.23, 5.42, 5.75, 5.83, 6.09
– High-Low Mean = (4.75 + 6.09)/2 = 5.42
– Are the high-low mean and the remaining case scores (shown in gold on the slide) all > 5.00? Yes → Pass
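The high-low computation can likewise be reproduced. Note that the sixth case mean, 6.09, is inferred here from the stated hi-lo average of 5.42 (2 × 5.42 − 4.75 = 6.09) because the transcript dropped it, so it may not match the original slide exactly:

```python
# Case means from the slide; the sixth (6.09) is back-calculated from
# the stated hi-lo mean of 5.42.
case_means = [4.75, 5.23, 5.42, 5.75, 5.83, 6.09]
hi_lo = (min(case_means) + max(case_means)) / 2
remaining = sorted(case_means)[1:-1]        # drop the lowest and highest
print(round(hi_lo, 2))                      # 5.42
print(all(m > 5.00 for m in remaining))     # True
print(hi_lo > 5.00)                         # True -> pass
```

Averaging the extremes lets one weak case (here, 4.75) be offset by the strongest one instead of failing the candidate outright.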

18 Why Two Ways to Pass?
– A standard setting study plus Bayesian procedures → 5.75
– BUT a 5+ is “acceptable performance”
– AND there is potential for measurement error, flukes, etc.
– So a hi-lo average and remaining case scores above 5.00 = Pass

19 A High Quality Exam
Interrater Reliability
– 97% agreement on Critical Actions
– 95% agreement on “Acceptable/Unacceptable”
– 94% of all performance criteria ratings within 1 point
Discriminant Validity
– Distinguishes among: 1. residency trained physicians, 2. physicians not trained via residency, 3. residents beginning the 2nd year of residency, 4. 4th-year medical students
– Correlation with written MC exam = .77
Predictive Validity
– The oral exam predicts performance better than the written exam does

20 Benefits of Scoring Procedure
– Stable pass rates
– High involvement of the EM community
– Checks and balances
– Standardized, yet flexible assessment

21 Speaker Contact Information
Elizabeth A. Witt, Ph.D.
American Board of Emergency Medicine
East Lansing, MI

