
1 Workplace assessments: moving forward
Karen Mattick, Professor of Medical Education, Centre for Research in Professional Learning

2 Session outline
Background
The Foundation Programme and the move to "Supervised Learning Events"
The importance of feedback
Formative and summative assessment
Our research on SLEs
How SLEs can be improved
Questions and discussion

3 Medical training under MMC (Delamothe, 2008, p54).
Medical training in the UK was restructured in 2008 with the introduction of Modernising Medical Careers (MMC) (Department of Health, 2004). New doctors follow a defined training pathway (The Next Steps, 2004): the two-year Foundation Programme (FY1 and FY2), in which an FY1 place is guaranteed and full registration with the General Medical Council follows successful completion of FY1; then specialist training years 1 and 2 (to make up three years of core specialist training), followed by higher specialist training, successful completion of which leads to a Certificate of Completion of Training confirming readiness for independent practice in that specialty at consultant level. The length of training for general practice was extended to five years (three years of core training plus two years as a general practitioner specialist registrar), bringing it in line with training in other developed European countries.

A shift to outcomes-based learning, defining a clear set of outcomes and assessing competencies and performance against those outcomes, was identified as a way to address at least some of the concerns raised about the restructured training (PMETB, 2005). Competence can be defined as the ability to do something successfully, that is, the ability to perform a given task (e.g. Gilbert, 1978). Competency-based training describes progression referenced to the demonstrated ability to perform certain tasks. The shift from the apprenticeship model to outcomes-based training under MMC addressed the issue of reduced training time by making progression through training depend on the acquisition of competencies rather than on time spent in training and clinical practice.

4 Foundation programme WPBAs
Aim to maximise learning, given various constraints on training time, including the European Working Time Directive
Outcomes-based learning assessing competences
Iterative development and refinement of WPBA tools:
Direct Observation of Procedural Skills (DOPS)
Mini Clinical Evaluation Exercise (Mini-CEX)
Case-based Discussion (CBD)

Certain criteria from van der Vleuten's utility equation (utility = reliability × validity × educational impact × cost × acceptability) have been examined in some detail for WPBA instruments, notably their reliability, validity and acceptability (see Norcini and Burch, 2007; Nair et al., 2008; Weller et al., 2009a, 2009b). Research has focused on instruments for workplace-based assessment of single clinical encounters (e.g. the Mini-CEX and DOPS). Trainee and observer attitudes towards the tools have been the most commonly measured outcomes (see the review by Kogan et al., 2009; see also Ryland et al., 2006; Weller et al., 2009b; Wilkinson et al., 2008), and this research indicates that WPBA tools are generally acceptable to users, both trainers and trainees (though see Collins, 2010; Overeem et al., 2009; Pereira and Dean, 2009; Sargeant et al., 2008 for studies identifying less positive attitudes, including among UK trainees).

A recent review by Pelgrim et al. (2010) summarised that the feasibility of the instruments seems to be good, as is their reliability given sufficient encounters. The validity of some instruments has not been investigated, but the validity of the Mini-CEX is supported by strong and significant correlations with other valid assessment instruments. Other criteria, however, have been neglected, particularly educational impact. Recent systematic reviews of the impact of WPBA on doctors' education and performance (Miller and Archer, 2010; Pelgrim et al., 2010) concluded that there are few published articles on the educational impact of WPBAs and no evidence that WPBA tools lead to improved performance. Likely reasons include the sub-optimal use of these tools for feedback (Fernando et al., 2008; Holmboe et al., 2004; Canavan et al., 2010), and other medical education findings indicate that sub-optimal learners are less likely to seek feedback (e.g. Sinclair and Cleland, 2007). Outcomes such as learning, transfer of skills to new situations, or improved patient care are relatively unstudied, and where they have been studied, the conclusions that can be drawn are limited by weak study designs. In fact, the strongest evidence for WPBA improving performance comes from studies of multisource feedback (Miller and Archer, 2010), now delivered in the Foundation Programme through the Team Assessment of Behaviour (TAB).

5 Van der Vleuten’s utility ‘equation’
Utility = reliability × validity × educational impact × cost × acceptability (van der Vleuten, 1996)
Acceptability of FP WPBAs generally good
Reliability good with sufficient encounters
Validity unclear, but the Mini-CEX correlates with other valid measures
Educational impact unclear: little evidence that WPBAs lead to improved performance (e.g. Miller & Archer, 2010), because:
little research on this
sub-optimal use of the tools for feedback
sub-optimal learners less likely to seek feedback
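Written out multiplicatively (a sketch of the model; the single-letter symbols below are shorthand introduced here, not van der Vleuten's own notation):

U = R × V × E × C × A, where R = reliability, V = validity, E = educational impact, C = cost, A = acceptability

Because the criteria multiply rather than add, a near-zero score on any one of them (for example, a tool that trainers refuse to use) drives overall utility towards zero, however strong the other criteria are.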

6 Collins Review of the FP, 2010
Is the FP meeting its original objectives?
Mixed findings concerning the utility of workplace-based assessment in postgraduate medical education (Norcini & Burch 2007; Wilkinson et al. 2008; Miller & Archer 2010)
Assessment of trainees was excessive, onerous and not valued
The range and frequency of assessment tools should be reviewed

7 Change to Supervised Learning Events, 2012
Highlight achievements and areas of excellence
Provide immediate feedback and suggest areas for further development
Demonstrate engagement in the educational process
A move from summative to formative assessment, with the emphasis on feedback (Academy of Medical Royal Colleges)

8 Formative vs summative assessment
Formative:
Aim is to enhance learning through providing feedback
Indicates what is good and why it is good, and how things could be improved
Integral part of teaching and learning
Affects what the student and the teacher do next
Doesn't contribute to the final mark, hence the focus is on educational impact

Summative:
Aim is to provide data for selection (e.g. next stage of education, employment)
Quantifies and rewards achievement in meeting assessed criteria
Normally at the end of a block of learning
Can provide information that has formative/diagnostic value
Contributes to the final mark, hence validity and reliability are important

10 Feedback in Foundation Doctors
“I don't think there’s too much time for feedback. I think if you did something terrible I am sure they would let you know ... You know, after a few months you learn that you know, no news is good news erm.” (Male F1, Location 1)

“We don't actually discuss ‘oh this is this clinical scenario so why do you think we did this, why do you think we did that’. I think we would gain a lot if that happened, just you know some explanation as to why this happened and also quizzing why do you think this can’t be done or why do you think this can be done.” (Male F1, Location 2)

“Formal teaching is fine but.. we don't have any more time to come off the wards to go to formal teaching… I feel as though there isn’t enough of an awareness of the need of FY doctors to be continually learning.” (Female F2, Location 1)

11 AoMRC-funded evaluation of SLEs, compared with previous WPBAs, one year into their implementation

12 Method
110 foundation trainees & trainers interviewed across England, Scotland & Wales
Maximum variation sampling, hospital & GP
Narrative interviews covering:
understandings of SLEs & WPBAs, and differences (if any)
experiences of SLEs & WPBAs
how SLEs should be developed
Qualitative and quantitative analysis: thematic and discourse analysis; narrative analysis

13 Difference between SLEs & WPBAs unclear (in 2013 at least)
“We’re using exactly the same as we were last year- um just under a different title which is SLEs so you know I’m a little bit puzzled as to- um as to what the difference actually is.” (Male F2, site 3)

“I think they’re identical, with the exception that with the WPBAs you can kind of fail someone if you like, but the SLEs are- um formative rather than summative, but for all intents and purposes, from an operational perspective of the way I conduct them, I don’t think there’s any difference.” (Male Trainer, site 3)

14 SLEs conceptualised in diverse ways
Formative vs summative tool:
“It is a learning event and you should be giving them feedback on the process there and then, and that should be used as a learning tool, um as well” (Female Trainer, site 2)
“part of it's almost like a safety net to catch people who aren't competent I think and to make sure that's highlighted to more senior doctors.” (Female F2, site 1)

Tick-box exercise, formalising “what we do anyway”:
“I think that’s just formalising what we do normally- ward round teaching it’s formalising that but also making it more time consuming because you have to write it all down” (Female Trainer, site 1)

One-off vs developmental:
“Problem is it’s just, the supervised learning events is just a one off thing, it’s just like a little snap shot” (Female F1, site 2)

15 But SLEs possibly evaluated more positively
In a sub-sample of interviews comprising 211 narratives, SLEs were more than twice as likely to be evaluated positively overall (n=100; 47%) as negatively (n=44; 21%).
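As a quick arithmetic check of these figures (a minimal sketch in Python, not the study's analysis code; the counts are simply those reported above):

narratives = 211   # narratives in the interview sub-sample
positive = 100     # narratives evaluating SLEs positively overall
negative = 44      # narratives evaluating SLEs negatively overall

# Percentages of all narratives, rounded as reported on the slide
print(round(100 * positive / narratives))   # -> 47
print(round(100 * negative / narratives))   # -> 21

# Positive evaluations are more than twice as common as negative ones
print(round(positive / negative, 1))        # -> 2.3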

16 Despite some ‘process’ problems
Initiation:
trainees strategically choosing particular trainers
not typically trainer-initiated

The learning event itself:
trainees receiving sub-optimal feedback
forms that don’t lend themselves to formative feedback

Finalisation:
chasing trainers to complete forms

17 Improving SLEs – next steps
Improve trainees' and trainers' understanding of formative assessments (SLEs)
Improve the trainer-trainee relationship, e.g. to promote longitudinal feedback
Improve the culture of workplace learning
Improve the tools, e.g. forms that encourage free text, systems that aid completion
Develop, assess & recognise trainers

Common suggestions for improvement fell into four categories. Individual: improving trainees' and trainers' understanding of SLEs through education. Interpersonal: improving the trainee-trainer relationship through regular longitudinal meetings, with trainees also providing trainers with feedback in order to close the 'feedback loop'. Cultural: further movement towards SLEs being genuinely formative, with trainers becoming more involved in their initiation. Technological: making the forms more formative through more free-text comments, and removing key barriers to completion, for example by allowing paper-based forms to be used.

18 References
Academy of Medical Royal Colleges (2012) The UK Foundation Programme Curriculum, July 2012.
Collins JP (2010) Foundation for Excellence: An Evaluation of the Foundation Programme. Medical Education England.
Norcini J & Burch V (2007) Workplace-based assessment as an educational tool: AMEE Guide No. 31. Medical Teacher 29.
Wilkinson JR et al. (2008) Implementing workplace-based assessment across the medical specialties in the United Kingdom. Medical Education 42.
Miller A & Archer J (2010) Impact of workplace based assessment on doctors’ education and performance: a systematic review. British Medical Journal 341: c5064.

