Best Evidence Medical Education & Evaluating the Evidence.


1 Best Evidence Medical Education & Evaluating the Evidence

2 Workshop Aim
The aim of this workshop is to explore how critical appraisal of research studies is carried out for non-experimental research, especially in the field of educational evaluation. It will help you to:
- Gain an overview of approaches to critical appraisal and an appreciation of its role in evidence informed practice and policy making.
- Identify the challenges educators face in judging evaluation designs from a variety of research paradigms, using both quantitative and qualitative data collection methods.
- Increase your knowledge of the purposes and process of systematic review research in professional education.
- Increase your awareness of the work of the Best Evidence Medical Education (BEME) Collaboration.
- Consider whether to submit a proposal to carry out a BEME systematic review or a rapid review.

3 Timetable
09:30–09:45  Welcome and introductions
09:45–10:15  Evidence informed education: national, international and professional perspectives (Marilyn Hammick)
10:15–11:00  Evidence informed practice in education: argument and evidence (Small group activity 1)
11:00–11:15  Refreshments
11:15–11:30  Plenary feedback from activity 1 (All)
11:30–12:00  Appraising and using education research papers in systematic review work (Marilyn Hammick)
12:00–12:45  The reality of critical appraisal, part A (Small group activity 2)
12:45–13:00  Plenary feedback from activity 2
13:00–13:45  Lunch
13:45–14:30  The reality of critical appraisal, part B (Small group activity 3)
14:30–14:45  Plenary feedback from activity 3
14:45–15:00  Refreshments
15:00–15:30  Identifying the need for and using evidence for practice and policy decisions (Small group activity 4)
15:30–15:45  Plenary feedback from activity 4 (All)
15:45–16:00  Take home messages and close (All)

4 Evaluating the Evidence
- Evidence informed practice and policy
- Critical appraisal of the evidence

5 Evidence informed education: national, international and professional perspectives
- International: Campbell Collaboration
- National (UK): EPPI-Centre (evidence for policy and practice information)
- Professional: Best Evidence Medical Education

6 C2 Coordinating Groups
- Crime and Justice
- Education
- Social Welfare
- Methods
- Communication and Internationalisation

7 Time for evidence based medical education
- Petersen S. Time for evidence based medical education: tomorrow's doctors need informed educators not amateur tutors. 1999. (Professor of Medical Education, Faculty of Medicine and Biological Sciences, University of Leicester.)
- Davies P. Approaches to evidence-based teaching. Medical Teacher 2000; 22(1): 14-21.
- Wolf FM. Lessons to be learned from evidence-based medicine: practice and promise of evidence-based medicine and evidence-based education. Medical Teacher 2000; 22(3): 251-259.
- van der Vleuten CPM et al. The need for evidence in education. Medical Teacher 2000; 22(3): 246-250.
- Bligh J, Anderson MB. Editorial: medical teachers and evidence. Medical Education 2000; 34: 162-163.

8 Best Evidence Medical Education (2001)
- Appropriate systematic reviews of medical education
- Dissemination of information
- A culture of best evidence medical education

9 Taking a BEME approach to educational decisions
- Comprehensively and critically appraise the literature that already exists: systematic, transparent
- Categorise the power of the evidence available: realism, epistemological openness
- Identify the gaps and flaws in the existing literature: published, grey, hand searching
- Suggest and carry out appropriately planned studies: optimise the evidence, make education interventions more evidence based

10 BEME (2008)
- 7 published reviews, 2 in press
- Rapid reviews: 3 in press
- BEME Spotlights
- Medical Teacher: BEME Guides, website
- Partnership with the University of Warwick, UK
- Autumn workshop / Spring conference
- Widening the community of practice

11 Published reviews (i)
- Issenberg SB, McGaghie WC, Petrusa ER, Gordon DL, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach 2005; 27(1): 10-28.
- Hamdy H, Prasad M, Anderson MB, Scherpbier A, Williams R, Zwierstra R, Cuddihy H. BEME systematic review: predictive values of measurements obtained in medical schools and future performance in medical practice. Med Teach 2006; 28(2): 103-116.
- Dornan T, Littlewood S, Margolis A, Scherpbier A, Spencer J, Ypinazar V. How can experience in clinical and community settings contribute to early medical education? A BEME systematic review. Med Teach 2006; 28(1): 3-18.

12 Published reviews (ii)
- Veloski J, Boex JR, Grasberger MJ, Evans A, Wolfson DB. Systematic review of the literature on assessment, feedback and physicians' clinical performance. Med Teach 2006; 28(2): 117-128.
- Steinert Y, Mann K, Centeno A, Dolmans D, Spencer J, Gelula M, Prideaux D. A systematic review of faculty development initiatives designed to improve teaching effectiveness in medical education: BEME Guide no. 8. Med Teach 2006; 28(6): 497-526.
- Hammick M, Freeth D, Koppel I, Reeves S, Barr H. A best evidence systematic review of interprofessional education: BEME Guide no. 9. Med Teach 2008; 29(8): 735-751.
- Colthart I, Bagnall G, Evans A, Allbut H, Haig A, Illing J, McKinstry B. The effectiveness of self-assessment on the identification of learner needs, learner activity, and impact on clinical practice: BEME Guide no. 10. Med Teach 2008; 30(2): 124-145.

13 Reviews in progress …
- To what extent is the OSCE a valid, reliable and feasible method of assessing the different learning outcomes in undergraduate medical education?
- A systematic review of the literature on the effects of portfolios on student learning in undergraduate medical education (in peer review)
- A systematic review on the use of portfolios in postgraduate assessment (in peer review)
- A systematic review of the evidence base around clinical and professional final year assessment in veterinary education

14 Reviews in progress (continued)
- Assessing the effectiveness and impact of a patient safety curriculum: David Mayer, University of Chicago College of Medicine, USA
- Effectiveness of journal clubs: Karen Kearley, Oxford University, UK
- Skills loss after resuscitation courses: Ben Shaw, Liverpool Women's Hospital, UK
- Work-based assessment in health professional education: Jill Thistlewaite, University of Queensland, Australia
- Educational games for students of the health care professions: Elie Akl, Department of Medicine, State University of New York, Buffalo, USA
- A review of the evidence linking conditions, processes and outcomes of clinical workplace learning: Tim Dornan et al., Manchester Medical School, UK

15 Activity 1: Evidence informed practice in professional education: argument and evidence (45 mins)
Task: critically analyse the following two papers, identifying the strengths and weaknesses of the arguments being made.
- Paper 1: Hammersley M (2005) Is the evidence-based policy movement doing more good than harm? Reflections on Iain Chalmers' case for research-based policy making and practice. Evidence & Policy 1(1): 85-100.
- Paper 2: Davies P (2000) The relevance of systematic reviews to educational policy and practice. Oxford Review of Education 26(3&4): 365-378.
Feedback: three key points from your discussion.

16 The systematic review examined (Hammick M. A BEME review: a little illumination. Med Teach 2005; 27(1): 1-4).

17 [Slide 17: image only; no recoverable content]

18 Systematic review of the effectiveness of interprofessional education (JET)
Databases searched: Medline 1966-2003, CINAHL 1982-2001, BEI 1964-2001, ASSIA 1990-2003
Filtering: 10,495 abstracts → 884 papers → 353 studies → 107 'robust' evaluations → 21 best evidence studies
Origin of studies: mainly North America (60%); UK 33%

19 Other BEME review examples
- Example 1: BEI, ERIC, Medline, CINAHL & EMBASE, 1992-2001; 6,981 abstracts → 699 papers → 73 studies in the review
- Example 2: Medline, Embase, EPOC, ERIC, BEI, up to 2001; 20,000 'hits' (titles scanned) → 560 papers, +44 on update (2001-4) → 33 studies in the review

20 Abstract filter, applying inclusion criteria

21 Abstract filter, applying inclusion criteria
MAPPING THE FIELD
- Directions for travel
- Key requirements
- Challenges and barriers
- Equipment for the journey
- Who should travel this way

22 Abstract filter, applying inclusion criteria
MAPPING THE FIELD
- Learners' views on the intervention
- Develops theory
- Setting and context of the intervention
- Macro issues
- Paints a picture; tells a story
Descriptive review: local, national, international

23 Abstract filter, applying inclusion criteria
Evaluation filter: what, how, when, who, where?
Broad, useful, limited
Systematic review: inclusive; general theory supported by some evidence

24 Abstract filter, applying inclusion criteria
Evaluation filter: what, how, when, who, where?
Quality filter: characteristics of effectiveness?

25 Abstract filter, applying inclusion criteria
Evaluation filter: what, how, when, who, where?
Quality filter: characteristics of effectiveness?
Focussed, robust, powerful, transferable
Systematic review: exclusive; specific theory supported by strong evidence

26 Four guiding principles: an enquiry should be
- Contributory: advances wider knowledge and/or understanding
- Defensible in design: provides a research strategy which can address the questions posed
- Rigorous in conduct: through the systematic and transparent collection, analysis and interpretation of data
- Credible in claim: offers well-founded and plausible arguments about the significance of the data generated
Ref: UK HM Government Strategy Unit

27 Quality judgement
- Contribution
- Design
- Conduct
- Claims

28 Contribution
- Assessment of current knowledge
- Identified need for knowledge
- Takes organisational context into account
- Transferability assessed

29 Defensible design
- Theoretical richness
- Evaluation question(s)
- Clarity of aims and purpose
- Criteria for outcomes and impact
- Resources
- Chronology

30 Conducted rigorously
- Ethics and governance
- Clarity and logic in sampling, data collection, analysis, synthesis and judgements

31 Makes credible claims
- Interpretation
- Judgement
- Collection

32 Activity 2: Critically appraise the strengths and weaknesses of primary research (45 mins)
Task: evaluate two reports of educational research and discuss their value for evidence informed decision making in professional education, using the BMJ guidelines for evaluating papers on educational interventions.
- Paper 3: Crutcher et al. (2004) Multi-professional education in diabetes. Medical Teacher 26(5): 435-443.
- Paper 4: Boehler et al. (2006) An investigation of medical student reactions to feedback: a randomised controlled trial. Medical Education 40: 746-749.
Feedback: the value of the two studies and the utility of the tool.

33 CASP tools
Appraisal tools by study design: systematic reviews, randomised controlled trials, qualitative research studies, cohort studies, case control studies, diagnostic test studies, economic evaluation studies.

34 Activity 3: Critically appraise the strengths and weaknesses of primary research (45 mins)
Task: evaluate two reports of educational research and discuss their value for evidence informed decision making in professional education, using the UK Government's framework for appraising the quality of qualitative evaluations (pp. 11-17) and/or the CASP tool.
- Paper 5: Alderson et al. (2002) Examining ethics in practice: health service professionals' evaluations of in-hospital ethics seminars. Nursing Ethics 9(5): 508-521.
- Paper 6: Bing-You et al. (1997) Feedback falling on deaf ears: residents' receptivity to feedback tempered by sender credibility. Medical Teacher 19(1): 40-44.
Feedback: issues involved in making judgements about reported research (three key points).

35 Activity 4: The need for and use of evidence (30 mins)
Task: discuss and identify the need for and use of evidence by education practitioners and policy makers in local and national contexts.
Plenary session:
- One practice area and one policy area that could be informed by evidence, and why
- Challenges in using evidence to shape practice and policy

36 To conclude…
- BEME seminar in your workplace
- Support and guidance for review groups
- AMEE 2009, Malaga: BEME sessions
- Warwick, May 2009: Portfolio Conference
- Contact
- Collect certificates and information
- Medev evaluation sheet

37 Finally …
Have a safe journey home. Thank you.
