
1 Postgraduate Medical Training – Evaluation and Audit Copenhagen, Nov 2013 Professor Wendy Reid, Medical Director, Health Education England; Past Vice-President, Education, RCOG © Royal College of Obstetricians and Gynaecologists

2 UK Specialty Training & Education Programme

3 A Model of Clinical Competence
Miller GE. The assessment of clinical skills/competence/performance. Academic Medicine (Supplement) 1990; 65: S63–S67.
Knows → Knows how → Shows how → Does (increasing professional authenticity)
Cognition = knowledge; Behaviour = skills + attitude

4 UK Specialist Training Programme
Basic – years 1 & 2, Part 1 MRCOG
Intermediate – years 3, 4 & 5, Part 2 MRCOG
Advanced – years 6 & 7, requires a minimum of 2 ATSMs, career development and 'independent' competencies
19 core modules, subject based, including professional skills and leadership

5 Basic Training
Exposure to the specialty
Basic emergency obstetric and gynaecology skills
Understanding the role of the doctor
Team work – multi-professional, developing leadership skills
Pass Part 1 MRCOG

6 Intermediate Training
Builds on basic skills
Leadership – clinical, administrative
Competences for normal practice, i.e. day-to-day obstetrics, emergency gynae and core gynae skills
Pass Part 2 MRCOG
Workplace-based assessments
More clinical responsibility, labour ward leadership and acute gynaecology; develop interests and choose advanced modules

7 Advanced Training
Core training continues throughout the programme!
Advanced Training Skills Modules (minimum of 2)
Designed to produce a workforce for the service and to give individuals scope to develop clinical expertise in a specific area
New ATSMs in development, some academic, some 'professional'

8 Advanced Training Skills Modules (ATSMs)
Fetal Medicine
Benign Vaginal Surgery
Advanced Labour Ward Practice
Advanced Laparoscopic Surgery for the Excision of Benign Disease
Benign Gynaecological Surgery: Laparoscopy
Labour Ward Lead
Benign Gynaecological Surgery: Hysteroscopy
Maternal Medicine
Colposcopy
Advanced Antenatal Practice
Vulval Disease
Acute Gynaecology and Early Pregnancy
Abortion Care
Gynaecological Oncology
Sexual Health
Subfertility and Reproductive Endocrinology
Menopause
Urogynaecology
Paediatric and Adolescent Gynaecology
Benign Abdominal Surgery
Medical Education
Domestic Violence

9 Workplace-Based Assessments
All trainee grades in the UK
Varied names but similar principles
Ongoing challenge of 'formative vs summative'
Monitoring through Royal Colleges/Faculties

10 Testing Formats
Miller GE. The assessment of clinical skills/competence/performance. Academic Medicine (Supplement) 1990; 65: S63–S67.
Knows and Knows how – written/computer-based assessment
Shows how and Does – performance/hands-on assessment (increasing professional authenticity)

11 Drivers for WBA
New curricula – trainees need to prove 'competence'
The GMC (the regulator) and the public want explicit evidence of competence
Professional examinations do not test 'real-life' skills and performance
Learning from other systems
One way of evaluating the quality of training

12 UK Experience of WBA
Began with the Foundation Programme (years 1 & 2 after graduation)
Launched 2005, an integrated assessment process
Applied regardless of post or geography
Outcomes collated by Sheffield University
Each training area (Deanery) informed of 'outliers'
Large cohort
Centralised faculty training

13 Specialty Training
From the end of F2 to CCT
New curricula launched August 2007
Assessment tools based on the Foundation Programme
Many 'in development' and specialty specific
Trainees in mixed programmes, mostly using log books to capture evidence of progress
Most curricula mandate 'minimum numbers of assessments'
Summarised annually in the ARCP (previously RITA)

14 Challenges of WBA in Specialty Training
Does it really measure the doctor?
Are we sure we are measuring the right things?
How often do the assessments need to be done?
Are they a good measure of continued competence?
How do we involve patients?
How do we ensure trainers are trained and have the time to do WBAs properly?
Providing QA requires large numbers of assessments – reliability is otherwise poor

15 Other Tools for QA of Training
Longitudinal analysis of MRCOG results – cohort comparison; demographic data required
Trainee doctor 'user' surveys
Trainee feedback at the end of training episodes
Population-wide survey of trainee doctors by the GMC

16 Whole QA system

17 Formal Requirements in the UK for Training QA
Royal College annual report to the GMC – specialty specific
Deanery (regional) annual report on Education and Training – all specialties
Trainees must complete the annual GMC survey
All curriculum changes assessed by the GMC
All examination changes and examination data submitted to the GMC

18 Whole QA system “The GMC expects medical schools and deaneries to demonstrate compliance with the standards and requirements that it sets. To do this, they will need to work in close partnership with the medical Royal Colleges and Faculties, NHS trusts and health boards and other LEPs. This means that QM should be seen as a partnership between those organisations because it is only through working together that medical schools, deaneries, Royal Colleges and Faculties, with LEPs, can deliver medical education and training to the standards required.” (GMC Quality Improvement Framework, para. 29)

19 Whole QA system “The GMC quality assures medical education and training through the medical schools and deaneries but day-to-day delivery is at LEP level. This delivery involves medical staff, medical education managers, undergraduate and postgraduate medical centre staff, other health professions and employers. Clinical placements, student assistantships, individual foundation programme and specialty including GP training are delivered through careful supervision and assessment by specialists in the relevant discipline advised and overseen by regional and local staff from the UKFPO, the Academy of Medical Royal Colleges and the relevant medical Royal College or Faculty.” (GMC Quality Improvement Framework, para. 46)

20 Role of Medical Royal Colleges
Set curriculum and workplace-based assessments for trainee doctors according to GMC standards
Set criteria for progression between stages of training
Engage with a range of stakeholders to assure the quality of training, particularly the 16 UK deaneries
Provide fora for making policy, sharing best practice and developing training requirements as clinical practice develops
Provide specialist faculty development
Assure the quality of individual trainees (recommendation for CCT/CESR, MRCOG)

21 Role of Medical Royal Colleges
Colleges can also raise concerns about patient or trainee safety directly with the GMC or CQC
Colleges work together on national medical education policy through the Academy of Medical Royal Colleges

22 Governance
College committees agree national policy on various aspects of specialty education (e.g. exams, curriculum, ARCP)
A network of College Tutors coordinates training in individual Trusts
Specialist educational management and leadership roles created in Colleges (e.g. committee chair)
Heads of Deanery Specialty Schools jointly appointed with Colleges
Colleges report to the GMC via an Annual Report

23 QA Processes
ARCP – Colleges send a specialist assessor to assure the deanery process for progressing trainees
Quality visits – Colleges provide a specialist assessor on request to join the deanery visit team
CCT/CESR(CP) and CESR – recommendation of an individual doctor to the GMC for inclusion on the specialist register
Examinations – standard-setting
Curriculum approval – changes to curricula approved by the GMC

24 Data on Quality
GMC Trainee Survey
ARCP outcome data – annual summary of achievements for every trainee
Examination data
Colleges' own surveys (e.g. Training Evaluation Form)
Reports from external assessors on local/regional QA processes (ARCP and quality visits)
Increasingly linked with quality-of-care and patient-safety reviews

25 GMC Trainees' Survey – O&G Perspectives
Three specific elements:
– How O&G trainees compare with other specialties
– How this year's results for specific questions compare with those in previous years (looking at the areas previously considered)
– Specialty-specific questions
Total number of trainees responding: 49,000 (95%)

26 Trainee Evaluation Forms
Not mandatory
Might work effectively if based on MSF '360' feedback
Should be a real-time tool for local training quality management
Best discriminator is 'would you recommend this job to a friend?'

27 Programme Groups
Programme Group | National: Mean / Min / Q1 / Median / Q3 / Max / Lower CI / Upper CI / N | This Report: Mean / Lower CI / Upper CI / N
ACCS | 79.66 / 24 / 72 / 80 / 92 / 100 / 78.75 / 80.57 / 1114 | 79.66 / 78.75 / 80.57 / 1114
Acute Internal Medicine | 81.72 / 20 / 76 / 84 / 92 / 100 / 81.32 / 82.12 / 4766 | 77.6 / 75.97 / 79.24 / 302
Allergy | 81.72 / 20 / 76 / 84 / 92 / 100 / 81.32 / 82.12 / 4766 | 82.8 / 74.7 / 90.9 / 10
Anaesthetics | 82.68 / 20 / 76 / 84 / 92 / 100 / 82.16 / 83.2 / 2409 | 82.58 / 82.05 / 83.11 / 2358
Anaesthetics F1 | 75.46 / 20 / 68 / 76 / 84 / 100 / 75.12 / 75.79 / 7077 | 89.92 / 88.5 / 91.33 / 198
Anaesthetics F2 | 78.67 / 20 / 72 / 80 / 88 / 100 / 78.33 / 79.01 / 7138 | 87.79 / 86.32 / 89.27 / 232
Audio vestibular medicine | 81.72 / 20 / 76 / 84 / 92 / 100 / 81.32 / 82.12 / 4766 | 81.6 / 74.11 / 89.09 / 15
Cardiology | 81.72 / 20 / 76 / 84 / 92 / 100 / 81.32 / 82.12 / 4766 | 81 / 79.75 / 82.26 / 550
Cardio-thoracic surgery | 83.67 / 20 / 76 / 84 / 96 / 100 / 83.19 / 84.14 / 3514 | 82.78 / 79.28 / 86.28 / 95
Chemical pathology | 84.93 / 20 / 80 / 84 / 96 / 100 / 83.93 / 85.92 / 672 | 80.45 / 76.81 / 84.09 / 62
Child and adolescent psychiatry | 86.46 / 20 / 80 / 88 / 96 / 100 / 85.75 / 87.18 / 1232 | 87 / 85.32 / 88.67 / 211
Clinical genetics | 81.72 / 20 / 76 / 84 / 92 / 100 / 81.32 / 82.12 / 4766 | 86.78 / 83.84 / 89.73 / 46
Clinical neurophysiology | 81.72 / 20 / 76 / 84 / 92 / 100 / 81.32 / 82.12 / 4766 | 85.22 / 80.66 / 89.77 / 23
Clinical oncology | 84.53 / 20 / 76 / 84 / 96 / 100 / 83.85 / 85.2 / 1332 | 82.3 / 80.79 / 83.81 / 285
Clinical pharmacology and therapeutics | 81.72 / 20 / 76 / 84 / 92 / 100 / 81.32 / 82.12 / 4766 | 78.86 / 71.5 / 86.22 / 21
Clinical radiology | 84.53 / 20 / 76 / 84 / 96 / 100 / 83.85 / 85.2 / 1332 | 85.13 / 84.38 / 85.89 / 1047
CMT | 74.55 / 20 / 64 / 76 / 84 / 100 / 74 / 75.11 / 2730 | 74.55 / 74 / 75.11 / 2730
Community Sexual and Reproductive Health | 78.59 / 20 / 68 / 80 / 88 / 100 / 77.93 / 79.25 / 1915 | 78.46 / 71.5 / 85.43 / 13
Core Anaesthetics | 85.28 / 20 / 80 / 88 / 96 / 100 / 84.48 / 86.08 / 1052 | 85.28 / 84.48 / 86.08 / 1052
CPT | 81.77 / 24 / 76 / 84 / 92 / 100 / 81.06 / 82.47 / 1529 | 81.77 / 81.06 / 82.47 / 1529
CST | 74.52 / 20 / 64 / 76 / 88 / 100 / 73.67 / 75.38 / 1463 | 74.52 / 73.67 / 75.38 / 1463
Dermatology | 81.72 / 20 / 76 / 84 / 92 / 100 / 81.32 / 82.12 / 4766 | 84.29 / 82.38 / 86.21 / 191
Emergency medicine | 80.15 / 28 / 72 / 80 / 92 / 100 / 78.98 / 81.32 / 550 | 80.15 / 78.98 / 81.32 / 550
Emergency Medicine F1 | 75.46 / 20 / 68 / 76 / 84 / 100 / 75.12 / 75.79 / 7077 | 87.81 / 86.05 / 89.57 / 169
Emergency Medicine F2 | 78.67 / 20 / 72 / 80 / 88 / 100 / 78.33 / 79.01 / 7138 | 82.45 / 81.75 / 83.14 / 1199

28 O&G Programme Group Comparison

29 Supervision (1)
How would you rate the quality of (clinical) supervision in this post?
Did you have a designated educational supervisor (the person responsible for your appraisal) in this post? Yes: 2012 – 99.3% (2011 – 99.5%, 2010 – 99.5%, 2009 – 99.8%)
In this post did you have a training/learning agreement with your educational supervisor, setting out your respective responsibilities? Yes: 2012 – 86.4% (2011 – 91.9%, 2010 – 92.6%, 2009 – 91.1%)
In this post did you use a learning portfolio? Yes: 2012 – 92.4% (2011 – 94.7%, 2010 – 89.9%, 2009 – 91.2%)
In this post were you told who to talk to in confidence if you had concerns, personal or educational? Yes: 2012 – 71.2% (2011 – 77.7%, 2010 – 72.2%, 2009 – 68.8%)

30 Supervision (2)
Did you have a formal meeting with your supervisor to talk about your progress in this post?
Did you have a formal assessment of your performance in the workplace in this post?

31 Access to Training (1)
How would you rate the practical experience you were receiving in this post?
In this post, how often have you worked beyond your rostered hours?

32 Access to Training (2)
How confident are you that this post will help you acquire the competencies you needed at that particular stage of your training?
How good or poor was access to each of the following in your post? (2012 question only)

33 Working Beyond Competence (1)
In this post how often did you feel forced to cope with clinical problems beyond your competence or experience?
In this post how often, if ever, were you supervised by someone who you felt wasn't competent to do so?
In this post how often have you been expected to obtain consent for procedures where you felt you did not understand the proposed intervention and its risks?

34 Working Beyond Competence (2)
In this post did you always know who was providing your clinical supervision when you were working? (2009–2011 inclusive)
In this post did you always know who your available senior support was during on call? (2012)

35 Undermining – 2012 (1)

36 Undermining 2012 (2)
Overall, 96.0% of trainees said they had never been bullied and/or harassed in their post, or if they had, it happened less than once a month; 1.1% said it happened every day or at least once per week (n=48,512). The equivalent figures for O&G are 83.33% and 2.37%.
1.6% said they had witnessed someone else being the victim of bullying and/or harassment in their post every day or at least once per week (n=48,464). The equivalent figure for O&G is 3.05%.
92.4% said they had never experienced behaviour from a consultant or GP that undermined their professional confidence and/or self-esteem or, if they had, it happened less than once a month; 1.7% said it happened every day or at least once per week (n=48,785). The equivalent figures for O&G are 80.96% and 2.53%.
(1,902 respondents for O&G)

37 O&G versus other specialties

38 Next Steps
Specialty-specific questions to be further analysed
Breakdown by training level may also be available – to be discussed with the GMC
Will be involved in preparation for the 2014 survey; results must be published more quickly
An updated trainers' survey has been discussed – may be in place within 12 months
TEF – a potential method of triangulation?
RCOG has appointed a Workplace Advisory Officer to combat undermining

39 Who Does What in Governance of Training?
Education Board – RCOG
Heads of Schools – joint between the College and the local regional Postgraduate Dean
Local Education and Training Boards (Deaneries)
Individual hospitals (Local Education Providers, LEPs) – DMEs or Clinical Tutors
Individual doctors through trusts or organisations

40 Future Developments
Outcome of the GMC's review of the QA system, Autumn 2013 (the GMC became the regulator of postgraduate medical education in 2010)
Growing recognition of the need to clarify the role of Colleges in QA
Increased focus on sharing data between deaneries, Colleges and the GMC
Increased emphasis on the role of educational and clinical supervisors/trainers, with consequent impact on service delivery
Impact of national policy changes, e.g. Shape of Training, the new English healthcare structure
Role of HEE; relationship with Colleges, the GMC and the devolved nations

