Presentation on theme: "Ramesh Mehay Programme Director (Bradford VTS)"— Presentation transcript:
1 Ramesh Mehay, Programme Director (Bradford VTS)
nMRCGP in a nutshell
Originally written 2007, updated Jan 2009
2 Aims and objectives
Aims:
Increase our understanding of nMRCGP
Help us feel more prepared for the assessments
And therefore feel better!
Objectives:
Provide an overview of nMRCGP
Share understanding
Share concerns (and address them?)
Practise COT
3 Session plan
Overview of the MRCGP and its components
Share fears and concerns
Practise some COTs in groups? Modelling
(IS2 – practise some CBDs)
4 Background to nMRCGP
nMRCGP replaces both the old MRCGP and SA
Based on the new GP curriculum
The new curriculum was developed by reviewing the literature, very extensive consultation with doctors and patients, etc
All components of nMRCGP are mapped to the competencies in the curriculum
GP training is now overseen by PMETB, like all other medical specialties (the JCPTGP is dead)
6 Components
AKT (Applied Knowledge Test): machine-marked test, 3x/year, at various venues
CSA (Clinical Skills Assessment): OSCE-type exam, 3x/year, Croydon
WPBA (Workplace Based Assessment): recorded in an e-portfolio held by the GP trainee throughout the 3 years
7 Clinical Skills Assessment
'Integrative assessment' with 3 domains:
Data gathering, technical and assessment
Clinical management
Interpersonal skills
13 stations, 10 mins each, with a balanced selection of cases
Graded as clear pass, marginal pass, marginal fail, clear fail, 'serious concerns'
Likely to have a significant failure rate
Need to take it early enough to have time to retake if necessary
8 Workplace Based Assessment (WPBA)
Workplace assessment: the assessment of actual working practices, undertaken in the working environment
9 Overview of WPBA
Looks at what the trainee actually does
Competencies are demonstrated 'when ready'
Assessment of developmental progression – guides decisions about future learning
Recorded in an electronic portfolio
The process is learner led – the trainee has to ensure their e-portfolio covers the e-curriculum
10 WPBA: compulsory components
Case Based Discussion (CBD)
Consultation Observation Tool (COT) or Mini-Clinical Evaluation Exercise (Mini-CEX)
Multi-Source Feedback (MSF)
Patient Satisfaction Questionnaire (PSQ)
Direct Observation of Procedural Skills (DOPS)
11 WPBA: local subunits
OOH work booklet
Clinical Supervisor's Report (CSR)
Naturally Occurring Evidence (NOE)
Significant Event Review (SER)
Referrals analysis
Audit
(Case Review, Personal Learning, Complaints)
12 Who makes judgements?
The Trainer/Clinical Supervisor, as (s)he does the assessments
The Educational Supervisor, as he reviews the 'whole' thing with the trainee
ARCP panels, who review the whole thing when a trainee is moving up an ST grade
13 Case Based Discussion (CBD)
A structured interview designed to explore professional judgement in clinical cases
Professional judgement = the ability to make holistic, balanced and justifiable decisions in situations of complexity and uncertainty
Attributes tested:
Application of medical knowledge
Application of ethical frameworks
Ability to prioritise, consider implications and justify decisions
Recognising complexity and uncertainty
14 CBD competency areas
CBD looks at 10 of the 12 competencies:
Practising holistically
Data gathering and interpretation
Making decisions/diagnoses
Clinical management
Managing medical complexity
Primary Care Administration (IMT)
Working with colleagues
Community orientation
Maintaining an ethical approach
Fitness to practise
(Not assessed by CBD: communication skills, and maintaining performance/learning/teaching)
15 CBD – the process
The trainee selects 3 cases and gives the material to the trainer 1 week in advance
Need a balance of cases and contexts
The trainer selects 2 and plans structured questions in advance
A 1-hour session covers 2 cases: 20 mins per case, 10 mins feedback
The trainer records evidence and judges the level of performance (insufficient evidence / needs development / competent / excellent)
Need to do a MINIMUM of 6 per post
All 6 before the ES meeting! (really, within 4 months)
16 Key points on CBD
It is a STRUCTURED oral interview:
On what the trainee actually did
And why they did that
And whether they considered anything else at the time
So don't ask 'what if' questions like you do in Random Case Analysis
Stick to the 'here and now' of the case
Use the question maker framework on (click nMRCGP then click CBD)
17 CBD: what's the experience so far? (from pilots)
Trainees:
Initially anxious, but less stressful than the current SA
Valued the feedback
Found it realistic
Some concern re the relationship with the trainer
Trainers:
Time consuming; need extra protected time
A helpful structure
May be more helpful for difficult trainees
Concern re the relationship with trainees
18 Consultation Observation Tool (COT)
The GPR selects a single consultation per session
GPR and trainer view it together
The trainer assesses the consultation on a 4-point rating scale (criteria fairly similar to the old MRCGP/SA)
A wide range of contexts is required, including at least one child, one older person and one mental health problem
No rule about consultation length (though this may be considered in the assessment)
Recommended that at least one consultation is assessed by someone other than the trainer
19 Why Workplace Based Assessment?
What was wrong with the old MRCGP or Summative Assessment?
22 Miller's Pyramid (or Prism) of Clinical Competence
23 What is authentic performance?
"Testing should be as close as possible to the situation in which one attacks the problem."
"Ill-structured problems are not found in simulated and/or standardized tests."
"The variation inherent in professional practice will always elude capture by a set of rules."
Wiggins, Assessing Student Performance: Exploring the Purpose and Limits of Testing, Jossey-Bass, 1993
24 Relationship between tools and competency areas
This says something about triangulation: a competency not demonstrated in one form of assessment can be looked for in another. Alternatively, you can validate the measures of a given competency by comparing results across several assessment methods.
25 Good assessment instruments have:
Reliability (R)
Validity (V)
Educational impact (E)
Acceptability (A)
Cost (C)
(Mnemonic: CARVE)
Van der Vleuten, The assessment of professional competence: developments, research and practical implications, Adv Health Sci Educ 1 (1996)
26 Why WPBA?
High validity = authenticity
High educational impact
Reliability depends on how many assessments you do; there is also some built-in triangulation
Reconnects assessment with learning and the workplace
Assessment over the entire training envelope
Cost effective (because you, the trainers, are doing it all!) – and now accepted!
27 And it gives continuous feedback
"a process of monitoring a student's progress through an area of learning so that decisions can be made about the best way to facilitate future learning"
28 The problem with WPBA
Inter-observer variation – the tendency of one observer to mark consistently higher or lower than another observer
Intra-observer variation – the variation in one observer's marks for no apparent reason (the good day/bad day phenomenon)
Case specificity – the variation in a candidate's performance from one challenge to another, even when the challenges seem to test the same attribute
29 Requirements of a high-stakes performance assessment (Baker, O'Neil, Linn 1991)
Specification – of standards, criteria and scoring guides
Calibration – of assessors and moderators
Moderation – of results, particularly those on the borderline
Training – of assessors, including retraining
Verification and audit – through quality assurance measures and collection of reliability data
Mmm... these are also some of the current problems that need sorting out.
30 Rough guide to the rating scale
Excellent – Smooth and efficient. Able to use knowledge, judgement and skills to adjust management appropriately to the specific patient and operative procedure.
Competent – Lacks smoothness and efficiency, but is able to use knowledge, judgement and skills to adjust management appropriately to the specific patient and operative procedure.
NEEDS FURTHER DEVELOPMENT:
Beginner – Lacks smoothness and efficiency. Able to manage the case but exhibits limited use of personal judgement and responsiveness to the specifics of the patient and operative procedure. Requires some limited coaching or attending intervention.
Novice – Can only manage the case with extensive coaching and attending intervention.