1 Mini-CEX: Is it an X?
Marishiel Mejia-Samonte, MD

2 The use of a variety of different assessment methods has been characteristic of medical education
Gone are the days when the medical knowledge and clinical skills of doctors were assessed using only written and oral examinations
Written examinations: open-ended questions graded by hand
Oral examinations: the student went to a patient's bedside, gathered the information, and then presented a diagnosis and treatment plan to assessors, who asked questions and made judgements about the performance

3 New methods have been developed focusing on clinical skills
Communication is a competency that cannot be tested well by written examinations, or by examinations in which the student-patient encounter is unobserved
The new methods focus on clinical skills such as:
History-taking and performing physical examinations
Communication skills
Procedural skills
Professionalism

4 Selection of Assessment Method: Quantifiable
Validity: the degree to which the inferences made about medical competence based on assessment scores are correct
Reliability (generalizability): a measure of the relative magnitude of variability in scores due to error, with the aim of achieving a desired level of measurement precision
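As background, the reliability concept above can be written in standard classical-test-theory form (this is a generic textbook formula, not one from the presentation): reliability is the proportion of observed-score variance that reflects true differences between trainees rather than measurement error.

```latex
\rho_{XX'} = \frac{\sigma^2_{\mathrm{true}}}{\sigma^2_{\mathrm{true}} + \sigma^2_{\mathrm{error}}}
```

Generalizability theory extends this by partitioning the error variance into facets such as raters, cases, and occasions, which is what makes it possible to plan for a desired level of measurement precision in advance.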

5 Selection of Assessment Method: Not Readily Quantifiable
Educational effect: capitalizes on the students' motivation to do well and directs their study efforts in support of the curriculum
Feasibility: the degree to which the assessment method selected is affordable and efficient for the testing purpose
Acceptability: the extent to which stakeholders in the process (students, faculty, patients) endorse the measure and the associated interpretation of scores

6 Simulations
Increasingly being used in medical education to ensure that examinees can demonstrate integration of prerequisite knowledge, skills, and affect in a realistic setting
Examples: standardized patients, computer-based simulations, computer programs, model-driven simulations, virtual reality

7 Issues Associated with Simulations
Fidelity
Equivalence
Standardization
Reliability
Case generation
Security

8 Simulation provides a means of beginning to assess these skills, but real patients often…
have more complex problems
are more acutely ill
demand more skill than can be simulated by modern technology

9 Work-Based Assessment
The educational mission dictates that the methods chosen for assessment protect the safety of patients and provide the opportunity for educational feedback to the trainee

10 Why Choose Work-Based Assessment
Training of doctors occurs in the setting of patient care
Ensuring that the type and complexity of patient-care problems doctors face during training match those encountered in practice is both a challenge and an opportunity

11 Work-Based Assessment
Trainees confront a broad array of health care problems and, just like doctors in practice, are required to integrate all of their skills in response
Those involved in the actual development of assessment systems will need to consider a number of issues

12 Designing a System of Assessment
Step 1: Define the content to be assessed, and the focus of the assessment Step 2: Define the purpose of assessment Step 3: The blueprinting process Step 4: Choose or develop methods Step 5: Train assessors Step 6: Standard setting Step 7: Reporting and review system

13 Mini Clinical Evaluation Exercise (Mini-CEX)
A method for simultaneously assessing the clinical skills of trainees and offering them feedback on their performance
A simple modification of the traditional bedside oral examination
Relies on the use of real patients and the judgments of skilled clinician-educators

14 How does the original CEX work?
A faculty member observes the trainee interacting with a patient in any of a variety of settings
The trainee conducts a focused history and physical examination and, after the encounter, provides a diagnosis and treatment plan
The faculty member scores the performance using a structured document and then provides educational feedback

15 How does the mini-CEX work?
Relatively short, about 15 minutes
Occurs as a routine part of the training program
Repeated on several different occasions by different faculty examiners
Each encounter should represent a different clinical problem, appropriately sampled from the list of patient problems

16 Competence Descriptors of a Satisfactory Trainee
History Taking
Facilitates the patient's telling of the story
Effectively uses appropriate questions to obtain accurate, adequate information
Responds appropriately to verbal and nonverbal cues
Physical Examination
Follows an efficient, logical sequence
Examination is appropriate to the clinical problem
Explains to the patient
Sensitive to the patient's comfort and modesty

17 Competence Descriptors of a Satisfactory Trainee
Professionalism
Shows respect, compassion, empathy
Establishes trust
Attends to the patient's needs for comfort, respect, confidentiality
Behaves in an ethical manner
Aware of relevant legal frameworks
Aware of own limitations
Clinical Judgement
Makes an appropriate diagnosis and formulates a suitable plan
Selectively orders/performs appropriate diagnostic studies
Considers risks and benefits

18 Competence Descriptors of a Satisfactory Trainee
Communication Skills
Explores the patient's perspective
Jargon-free
Open and honest
Empathetic
Agrees the management plan/therapy with the patient
Organization/Efficiency
Prioritizes
Timely
Summarizes

19 Competence Descriptors of a Satisfactory Trainee
Overall Clinical Care
Demonstrates satisfactory clinical judgement, synthesis, caring, effectiveness, and efficiency
Appropriate use of resources
Balances risks and benefits
Aware of own limitations

20 Strengths of the CEX
It evaluates the trainee's performance with a real patient. In medical school, the Objective Structured Clinical Examination (OSCE) is often used, and it does an excellent job of assessing clinical skills. As trainees approach entry to practice, however, their education and assessment need to be based on performance with real patients who exhibit the full range of conditions seen in the clinical setting.
The trainee is observed by a skilled clinician-educator who both assesses the performance and provides educational feedback. This enhances the validity of the results and ensures that the trainee receives the type of constructive criticism that should result in a reduction of errors and an improvement in quality of care.
The CEX presents trainees with a complete and realistic clinical challenge. They have to get all of the relevant information from the patient, structure the problem, synthesise their findings, create a management plan, and communicate this in both oral and written form.

21 Weaknesses of the CEX
Standards to follow
Alternative assessments
Selection of assessors
Equivalence
The research showed that trainees' performances with one patient were not a very good predictor of their performances with other patients. Consequently, they needed to be observed on different occasions with different patients before reliable conclusions could be drawn about their competence. Observing each trainee with several patients was also desirable from an educational perspective, since different patients require different skills from trainees, and this significantly broadens the range and richness of the feedback they receive.
The research also showed that assessors did not agree with each other even when observing exactly the same performance. Training of assessors helps to some degree, but much larger improvements in the reliability and validity of the ratings were achieved by including different faculty members in the overall assessment of each trainee. This was also useful from the perspective of education, since trainees received feedback from different assessors, each with their own specialties, strengths, and perspectives.
In terms of the method itself, the CEX focused on the trainee's ability to be thorough with a single new patient in a hospital setting, uninfluenced by time constraints. In contrast, different patients pose different challenges, and the tasks or competencies required of doctors vary considerably depending on the setting in which care is rendered. Further, most patient encounters are much shorter than two hours, so the CEX does not assess the trainee's ability to focus and prioritize diagnosis and management.

22 Evidence on the Utility of the Mini-CEX
Reliability
Eight (8) to fourteen (14) raters: the original studies showed a reliability of 0.8 with fourteen raters, while more recent studies have shown reliable results with as few as eight
A narrow standard error of measurement (SEM) suggests that trainees with clearly high (or low) initial scores may need as few as four encounters, so further assessment can be focused on trainees with borderline results
Norcini demonstrated good inter-rater reliability, with no large differences in ratings between examiners or across settings
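The reliability claims above can be illustrated with two standard classical-test-theory formulas: the Spearman-Brown prophecy formula, which predicts how reliability grows with the number of encounters, and the standard error of measurement. This is a generic sketch; the single-encounter reliability of 0.25 used below is an assumed value for illustration, not a figure from the presentation.

```python
import math

def spearman_brown(r1: float, k: int) -> float:
    """Predicted reliability of the mean score over k comparable encounters."""
    return (k * r1) / (1 + (k - 1) * r1)

def encounters_for_target(r1: float, target: float) -> int:
    """Smallest number of encounters whose predicted reliability meets `target`."""
    k = 1
    while spearman_brown(r1, k) < target:
        k += 1
    return k

def sem(sd: float, reliability: float) -> float:
    """Standard error of measurement: SD * sqrt(1 - reliability)."""
    return sd * math.sqrt(1.0 - reliability)

r1 = 0.25  # assumed reliability of one observed encounter (illustrative)
print(encounters_for_target(r1, 0.80))          # encounters needed to reach 0.8
print(round(sem(sd=1.0, reliability=0.80), 3))  # precision of the composite score
```

With an assumed single-encounter reliability of 0.25, twelve encounters are predicted to reach 0.8, consistent with the eight-to-fourteen range quoted above; a narrower SEM likewise means fewer encounters are needed before a defensible decision can be made about a trainee.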

23 Evidence on the Utility of the Mini-CEX
Validity
Good face validity: involves the observation of a real patient encounter in a real clinical environment
Able to differentiate between levels of experience: scores improve over time, and more experienced trainees receive higher ratings

24 Evidence on the Utility of the Mini-CEX
Educational Impact
There is some evidence that the mini-CEX promotes deep learning and encourages self-reflection
Its educational effect is based on a significant increase in the number of occasions on which trainees are directly observed with patients and offered feedback on their performance

25 Cautions on the Utility of the Mini-CEX
Higher ratings are given for more complex cases
Faculty ratings of students are lower than residents' ratings of students
Not intended for use in high-stakes assessments; should not be used to rank or compare candidates
There may be a significant halo effect, with high correlation between the scores achieved on individual competencies. Care is needed when interpreting the results of a mini-CEX instrument that attempts to assess multiple distinct domains of performance.
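The halo-effect caution can be made concrete with a small calculation: when one global impression drives all the marks, the separate domain scores correlate strongly with each other. The trainee ratings below are invented for illustration, not data from any mini-CEX study.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length rating lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Six hypothetical trainees rated 1-9 on three mini-CEX domains
history     = [6, 4, 7, 5, 8, 3]
examination = [6, 5, 7, 5, 8, 3]
counselling = [5, 4, 7, 6, 8, 4]

print(round(pearson(history, examination), 2))  # near 1 suggests a halo effect
print(round(pearson(history, counselling), 2))
```

Uniformly high pairwise correlations like these suggest the domains are not being judged independently, which is exactly why per-domain scores from a single mini-CEX form should be interpreted cautiously.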

26 Cautions on the Utility of the Mini-CEX
Primary purpose: to provide an opportunity to observe the trainee's clinical skills (which otherwise happens rarely) and to give constructive feedback
For this to happen effectively, both the assessor and the trainee need to be familiar with the assessment instrument, and the assessor needs to be competent in the procedure/skill being assessed (in order to make a judgement) and trained in how to give feedback
Assessors therefore need to be trained in the use of the mini-CEX assessment method

27 Factors Influencing Rater Judgements
Intrinsic factors: gender, experience or expertise, clinical skills, bias, rater confidence
Judgment-making factors: conceptualization, interpretation, attention, impressions
Lee, V. et al. Academic Medicine

28 Factors Influencing Rater Judgements
External factors
Scores varied according to specialty, encounter setting, and factors related to doctor, patient, and consultation; raters also struggled with how to take contextual influences (e.g., consultation complexity, a non-native-speaking patient, presentation of multiple complaints) into account when assessing communication skills
Case specificity: higher scores were associated with encounters of increased duration or higher complexity
Local rater bias: investigators using a direct observation tool similar to the mini-CEX for the Neurology Clinical Examination found that local faculty were less likely to fail residents than external faculty
Kogan and colleagues reported that the assessor's prior knowledge of or relationship with the resident, and the institutional culture and educational system, all influenced mini-CEX ratings
Rater training alone appeared to have conflicting results on improving the inter-rater reliability of faculty ratings
Lee, V. et al. Academic Medicine

29 Factors Influencing Rater Judgements
Scoring factors
Scoring integration: the assimilation of the different mini-CEX domains into an overall score
Domain differentiation: the ability to distinguish different dimensions of performance across the mini-CEX domains
Lee, V. et al. Academic Medicine

30 Tamara. Family Medicine

31 Feedback
One of the basic teaching methods used in clinical settings
A general complaint from medical students and residents is, "I never receive any feedback."
Explanations for this perceived lack of feedback:
an actual lack of feedback
students not realizing that they have been getting feedback
problems with data collection on feedback received by students
One hypothesis is that clinicians do not appreciate the role of feedback as a fundamental clinical teaching tool, and do not recognize the many opportunities for using that tool


34 Designing a System of Assessment
Step 1: Define the content to be assessed, and the focus of the assessment Step 2: Define the purpose of assessment Step 3: The blueprinting process Step 4: Choose or develop methods Step 5: Train assessors Step 6: Standard setting Step 7: Reporting and review system


