
Assessment of Knowledge and Performance




1 Assessment of Knowledge and Performance
John Littlefield, Ph.D. University of Texas Health Science Center at San Antonio

2 Goals: Assessment of Knowledge and Performance
Clarify two distinct uses for assessments of clinical knowledge and performance
Define two aspects of validity for all assessment methods
Compare and contrast three techniques for assessing clinical knowledge and performance
Identify poorly written multiple-choice test items
Write a key features test item
Describe a role for narrative comments in scoring interactions with Standardized Patients
Describe three elements of a clinical performance assessment system
Critique a clinical performance assessment system that you use

3 Agenda: Assessment of Knowledge and Performance
Exercise: Warm-up for assessing knowledge and performance
Presentation: Quality assurance when assessing clinical knowledge and performance
Exercise: Take, then critique, a multiple-choice test
Presentation: Key features test items
Exercise: Write a key features test item
Presentation: Widening the lens on SP assessment
Exercise: Recommend program director actions based on faculty comments about a resident
Presentation: Improving clinical performance assessment systems
Exercise: Critique your clinical performance assessment system

4 Warm-up Exercise
What answer did you circle for questions 1.a and 1.b?
What adjectives did you list for the marginal student/resident?
What concerns do you have about assessing knowledge and performance that you would like addressed?

5 Uses for Assessment: Formative vs. Summative
Formative: purpose is feedback for learning; narrow focus on specific objectives; scoring gives explicit feedback; learner affective response is little anxiety; target audience is the learner.
Summative: purpose is certification/grading; broad focus on general goals; scoring reflects overall performance; learner affective response is moderate to high anxiety; target audience is society.
Example: learning to drive vs. taking a driving test. Faculty often want to conduct both types of evaluations at the same time.

6 Validity of Knowledge and Performance Assessments 1
Content – Does the assessment method measure a representative cross-section of student/resident competencies?
Internal structure – Do content and scoring focus on a specific clinical competency (e.g., patient care)?
Relation to other assessments – Do scores from this assessment correlate highly with other measures of the same student competency?
Consequences – Do various subgroups of students (e.g., different ethnic groups) score equally well on the assessment?
Generalizability – Does the student perform at about the same level across 5 to 7 different patients/case problems? Does the student receive a similar rating from different faculty?
Cognitive process – Does the context surrounding the assessment evoke the domain of cognitive processing used by a physician?
1. Standards for Educ. & Psych. Testing, AERA, APA, NCME, 1999, p

7 Six Aspects of Assessment Validity Viewed as a Cube
Generalizability Relation to other assessments Internal structure Consequences Content Cognitive process

8 Generalizability of Physician Performance on Multiple Patients

9 Validity of Knowledge and Performance Assessments

10 Cognitive Process Aspect of Validity: Four Levels of Performance Assessment 1
Does (Global Rating)
Shows How (OSCE)
Knows How (Examination – Oral)
Knows (Examination – Multiple-choice)
Miller, GE. Assessment of clinical skills/competence/performance, Academic Medicine, 65(9), supplement, S63-7, 1990.

11 Compare and Contrast Three Assessment Techniques
(multiple-choice exam, OSCE, performance appraisals)

                             M.C.E.   OSCE   Perf. App.
Content
Internal structure            +++      ++       +
Rel. to other assessments      +        +       +
Consequences
Generalizability (5 to 7 case problems; agreement among raters)
Cognitive process

+ = adequate, ++ = good, +++ = excellent

12 Interim Summary of Session
Session thus far:
Two uses of knowledge and performance assessments: formative and summative
Validity of all assessment techniques
Compare and contrast three assessment techniques
Coming up:
Take and critique a 14-item multiple-choice exam
Presentation on Key Features items

13 How are Multiple Choice Items Selected for an Exam?

14 Sample Exam Blueprint based on Clinical Problems
Page G, Bordage G, Allen T. Developing key-feature problems and examinations to assess clinical decision-making skills, Acad. Med. 70(3), 1995.

15 Key Features of a Clinical Problem
Definition: critical steps that must be taken to identify and manage a patient’s problem. A key feature focuses on a step in which examinees are likely to make an error, and is a difficult aspect of identifying and managing the problem.
Example: for a pregnant woman experiencing third-trimester bleeding with no abdominal pain, the physician should:
generate placenta previa as the leading diagnosis
avoid performing a pelvic examination (may cause bleeding)
avoid discharging the patient from the clinic or emergency room
order coagulation tests and a cross-match
Page G, Bordage G, Allen T. Developing key-feature problems and examinations to assess clinical decision-making skills, Acad. Med. 70(3), 1995.

16 Test Items based on a Clinical Problem and its Key Features
Page G, Bordage G, Allen T. Developing key-feature problems and examinations to assess clinical decision-making skills, Acad. Med. 70(3), 1995.

17 Scoring the Placenta Previa Clinical Problem
Key Feature 1: to receive one point, the examinee must list placenta previa or one of the following synonyms: marginal placenta or low placental insertion.
Key Features 2–4: the examinee receives 1/3 point for listing each of the following: 1. avoid performing a pelvic exam; 2. avoid discharging from the clinic; 3. order coagulation tests and cross-match.
Total score for the problem: add the scores for items 1 and 2 and divide by 2 (range: )
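The scoring rules above can be sketched in a few lines of code. This is a minimal, hypothetical sketch: the function names and the assumption that write-in answers are already normalized to lower-case phrases are illustrative, not part of the published scoring key.

```python
# Hypothetical sketch of the placenta previa scoring rules; answer
# normalization and action phrasings are illustrative assumptions.

ITEM1_SYNONYMS = {"placenta previa", "marginal placenta", "low placental insertion"}
ITEM2_ACTIONS = {
    "avoid performing a pelvic exam",
    "avoid discharging from clinic",
    "order coagulation tests and cross-match",
}

def score_item1(diagnoses):
    """One point if any listed diagnosis matches an accepted synonym."""
    return 1.0 if any(d.strip().lower() in ITEM1_SYNONYMS for d in diagnoses) else 0.0

def score_item2(actions):
    """One-third point for each of the three key management actions listed."""
    return sum(1 / 3 for a in ITEM2_ACTIONS if a in actions)

def total_score(diagnoses, actions):
    """Add the two item scores and divide by 2, as the slide specifies."""
    return (score_item1(diagnoses) + score_item2(actions)) / 2
```

A fully correct response then scores 1.0, and partial credit accumulates in thirds on the management item.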

18 Steps to Develop a Clinical-Problem Based Exam
Define the domain of clinical problems to be sampled by the exam
Develop an exam blueprint to guide selection of clinical problems
Develop a key-feature problem for each clinical problem selected:
define the clinical situation for the problem (e.g., single typical problem, life-threatening situation, etc.)
define the key features of the problem
select a clinical case to represent the problem and write the scenario
write exam items for the case; in general, one item for each key feature
select a suitable format for each item (e.g., write-in or MCQ)
develop a scoring key for each item
pilot test items to obtain item-analysis data to guide refinement
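The blueprint step above can be sketched as a small sampling routine. The domains, counts, and problem-bank structure here are invented for illustration; a real blueprint would come from the exam committee's domain definition.

```python
import random

# Illustrative blueprint: domain -> number of clinical problems to sample.
# These domains and counts are invented, not from the slide.
BLUEPRINT = {"cardiovascular": 4, "respiratory": 3, "obstetrics": 3}

def sample_problems(problem_bank, blueprint, seed=0):
    """Draw the blueprint-specified number of problems from each domain."""
    rng = random.Random(seed)  # fixed seed for a reproducible exam form
    exam = []
    for domain, n_problems in blueprint.items():
        exam.extend(rng.sample(problem_bank[domain], n_problems))
    return exam
```

Sampling per domain, rather than from one pooled bank, is what makes the exam a representative cross-section (the content aspect of validity discussed earlier).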

19 Interim Summary of Session
Session thus far:
Two uses of knowledge and performance assessments: formative and summative
Validity of all assessment techniques
Compare and contrast three assessment techniques
Take and critique a 14-item multiple-choice exam
Write a Key Features item
Coming up:
Scoring performance on an SP exam

20 Schematic Diagram of a 9 Station OSCE
Examinees proceed from Start through stations 1 to 9 to End (stations arranged in a loop).

21 Widening the Lens on SP Assessment 1
Traditional scoring of SP assessment focuses on numerical data, typically from checklists.
Dimensions of the SP exam:
basic science knowledge (organize the information)
physical exam skills (memory of routines)
establishing a human connection
role of the student (appear knowledgeable)
existential dimension of the human encounter (balance one’s own beliefs with the patient’s)
Clinical competence – a mixture of knowledge and feeling, information processing and intuition.
1. Rose, M. & Wilkerson, L. Widening the Lens on Standardized Patient Assessment: What the Encounter Can Reveal about the Development of Clinical Competence, Acad. Med. 76(8), 2001.

22 Interim Summary of Session
Session thus far:
Two uses of knowledge and performance assessments: formative and summative
Validity of all assessment techniques
Compare and contrast three assessment techniques
Take and critique a 14-item multiple-choice exam
Write a Key Features test item
Use narrative comments as part of assessment via SPs
Coming up:
Improving clinical performance assessment systems

23 Dr. Tough’s Memo regarding Dr. Will E. Makit (PGY 2)
“The performance of Dr. Makit in General Surgery this month has been completely unsatisfactory. Every member of the clinical faculty who has had any contact with him tells me of his gross incompetence and irresponsibility in clinical situations. This person is an embarrassment to our school and our training program. I spoke to him about his performance after I talked with you several weeks ago and he told me that he would improve it. There was no evidence that he made any effort to improve. There is no way this can be considered a completed or satisfactory rotation in General Surgery. In fact, he is the most unsatisfactory resident who has rotated through our service in the last five years, and his behavior is an appalling example to the rest of our housestaff.”
Your action?
1. Refer the problem to the Resident Education Committee for an administrative decision.
2. Assign Dr. Makit to a rotation with Dr. Insightful as the Attending Faculty.

24 Dr. Insightful’s Phone Comments regarding Dr. Makit

25 Resident Performance Assessment System
Organizational Infrastructure

26 A. Department’s Organizational Infrastructure
Department head emphasizes completing and returning PA forms
Consequences for evaluators who don’t complete PA forms
PA form is brief (< 10 competencies)
Don’t request pass/fail judgment by individual faculty
Evaluators trained to use PA form & criteria
Evaluators believe they will be supported when writing honest appraisals
Specific staff assigned to monitor compliance in returning forms
Program Director alerted immediately when a returned form reflects cautionary info

27 B. Elements of Individual Evaluator Role in PA System
Communicate expectations
Observe performance
Interpret and judge performance
Communicate performance information
Coach resident
Complete performance appraisal form

28 B. Evaluator Role: Communicate Expectations and Observe Performance
1. Communicate Expectations
Consensus among evaluators about service and education expectations
Residents are crystal clear about service and education expectations
2. Observe Performance
Evaluators observe resident multiple times before completing PA form
Appraise only performance directly observed
Other staff (e.g., nurses) complete PA forms

29 B. Evaluator Role: Interpret and Judge Performance
Evaluators agree on what behaviors constitute outstanding, ‘average’, and marginal performance
When facing a marginal resident, evaluators record the rationale for their judgment and info surrounding the event
Evaluators record their interpretation of the performance soon after the behavior occurs
Diagnose performance (quality within normal limits?)

30 B. Evaluator Role: Coach Resident
Evaluators aware of the difference between corrective feedback, criticism, and compliments
Faculty actively coach residents in a timely manner
Residents are encouraged to ask for feedback
Evaluators regularly invite self-assessment from residents before giving feedback

31 B. Evaluator Role: Communicate Performance Information and Complete PA Form
5. Communicate Performance Info
Communicate incidents that are significantly negative or positive
Document in writing even single instances of poor or inappropriate behavior
Communicate performance information (to whom?)
6. Complete PA Form
Evaluators write specific narrative comments on PA forms
Evaluators forward completed PA forms to the Director in a timely way

32 C. Elements of Program Director Role in PA System
Monitor and interpret performance appraisals
Committee decision
Inform resident

33 C. Program Director’s Role: Monitor and Interpret Appraisals
Recognize evaluator rating patterns (stringent vs. lenient) to accurately interpret PAs
Contact evaluators to elicit narrative info when it is absent, to substantiate a marginal PA
Store PA forms in residents’ files in a timely manner
Summarize PA data to facilitate decision making by the Resident Education Committee
Keep longitudinal records of PA data to develop norms for the PA form
- How are ratings used in the dept.? In many depts., a detailed rating form is used to provide feedback to the ratee, but the evaluation committee looks only at the numerical ratings, without any comments.
- How many have rating forms that include 10 or more dimensions of performance? Do the raters give you 10 separate ratings, or do they tend to give each ratee either all positive or all negative ratings? In PA this used to be called halo error. This phenomenon occurs very frequently; 5 studies have found that 2 dimensions can adequately describe ratings of physician performance (show visual). Failure to discriminate may mean that the rating form doesn’t help the rater communicate her perceptions of ratee performance.
- To make a high-stakes decision, one should be confident that the confidence interval around the score is relatively small; independent ratings will typically result in a relatively precise score. The Ramsey reference in your reading list provides more info.
- The most important criterion of a functional PA system is whether it provides documentation to make a difficult negative decision.
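The point about confidence intervals narrowing with more independent ratings can be illustrated with a small sketch (the rating values are invented, and a normal approximation with z = 1.96 is assumed rather than anything from the slide):

```python
import math

def mean_ci(ratings, z=1.96):
    """Approximate 95% confidence interval for the mean of independent ratings."""
    n = len(ratings)
    mean = sum(ratings) / n
    var = sum((r - mean) ** 2 for r in ratings) / (n - 1)  # sample variance
    half_width = z * math.sqrt(var / n)                    # z * standard error
    return mean - half_width, mean + half_width
```

With the same spread of ratings, tripling the number of independent evaluators roughly halves the interval width, which is why several completed PA forms per resident give a more defensible score than one.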

34 C. Program Director’s Role: Committee Decision
PA decisions are a collaborative process involving multiple faculty
Seven or more PA forms per resident are available when admin decisions are made
Sufficient written narrative documentation is available when admin decisions are made

35 C. Program Director’s Role: Inform Resident
Residents are given a summary of their performance every six months
Evaluators have written guidelines outlining what must legally be in a probation letter
Evaluators know what documentation is needed to ensure adequate due process
Each resident receives an end-of-program performance evaluation

36 Formative Evaluation: Diagnostic Checklist for Resident Performance Assessment System

37 Research: Improving Resident Performance Appraisals 1
Organizational Infrastructure:
Discussed PA problems at department meetings
Appointed a task force to review PA problems and propose solutions
Revised the old appraisal form
Pilot-tested and adapted the new appraisal form
Evaluator Role:
Provided evaluators with examples of behaviorally-specific comments
Results:
Increased the number of forms returned, the number of forms with behaviorally-specific comments, and the number of administrative actions by the program
1. Littlefield, J. and Terrell, C. Improving the quality of resident performance appraisals, Academic Medicine, 1997: 72(10) Supplement, S45-47.

38 Research: Improving Written Comments by Faculty Attendings1
Organizational Infrastructure:
Conducted 20-minute educational sessions on evaluation and feedback
Provided a 3 x 5 reminder card and diary
Results:
Increased written comments specific to defined dimensions of competence
Residents rated the quantity of feedback higher and were more likely to make changes in the clinical management of patients
1. Holmboe, E., et al. Effectiveness of a focused educational intervention on resident evaluations from faculty. J. Gen. Intern. Med. 2001: 16:

39 Research: Summative Evaluation of a PA System1,2
Research Question                                   Pre-intervention             Post-intervention
1. % rotations returning PA forms?                  mean: 73% (range: %)         mean: 97% (range: %)
2. % PA forms communicating performance problems?   3.6% (n = 17/479)            5.9% (n = 64/1085)
3. Probability program will take admin action?      .50 (5/10)                   .47 (8/17)
4. Reproducibility of numerical ratings?            Gen. coef. = .59 (10 forms)  Gen. coef. = .80

1. Littlefield, J.H., Paukert, J., Schoolfield, J. Quality assurance data for resident global performance ratings, Academic Med., 76(10), supp., S102-04, 2001.
2. Paukert, J., et al. Improving quality assurance data for resident subjective performance assessment, manuscript in preparation.

40 Recall a medical student or resident whose performance made you uneasy.
1. What behavior or event made you uneasy?
2. What action did you take?
a. Talk with faculty colleagues about your concerns
b. Write a candid performance appraisal and send it to the course/residency director
3. If you wrote a candid appraisal, did an administrative action occur related to the student/resident?

41 Goals: Assessment of Knowledge & Performance
Clarify two distinct uses for assessments of clinical knowledge and performance
Define two aspects of validity for all assessment methods
Compare and contrast three techniques for assessing clinical knowledge and performance
Identify poorly written multiple-choice test items
Write a key features test item
Describe a role for narrative comments in scoring interactions with Standardized Patients
Describe three elements of a clinical performance assessment system
Critique a clinical performance assessment system that you use

