Standardizing Learner Surveys Across the Enterprise. Francis Kwakwa, MA, Radiological Society of North America; Valerie Smothers, MA, MedBiquitous.

Presentation transcript:

Standardizing Learner Surveys Across the Enterprise Francis Kwakwa, MA, Radiological Society of North America Valerie Smothers, MA, MedBiquitous

Disclosure We have no financial relationships to disclose.

Objectives
At the completion of this session, you will be able to:
Adopt strategies to improve the collection of consistent evaluation data from learners
Adopt strategies to improve the analysis of evaluation data across the CME enterprise

Overview
1. Challenges in analyzing learner surveys
2. MedBiquitous and MEMS
3. RSNA’s implementation of a standardized survey
4. Results of RSNA course evaluation
5. Challenges faced by RSNA
6. Key strategies for improving data collection and analysis

Challenges in Analyzing Learner Surveys
Most of us use surveys
Surveys often differ based on activity
Survey data may be in different systems or formats
The result: it’s hard to analyze results across activities
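To make the format mismatch concrete, here is a minimal Python sketch. The field names, rating scales, and the normalize helper are hypothetical, invented for illustration, and are not drawn from any real survey system.

```python
# Two hypothetical exports of the same survey question from different systems:
# different field names, different rating scales.
survey_a = [{"q": "The course achieved its learning objectives", "score": 4}]  # 1-5 Likert
survey_b = [{"item": "Course met its objectives", "rating": 80}]               # 0-100 percent

def normalize(record):
    """Map each system's fields onto one shared shape (illustrative mapping)."""
    if "score" in record:                                # system A: 1-5 scale
        return {"question": "objectives_met", "value": record["score"] / 5}
    return {"question": "objectives_met", "value": record["rating"] / 100}     # system B

# Without a shared question set and data format, every new survey source needs
# another hand-written mapping like this before results can be pooled.
pooled = [normalize(r) for r in survey_a + survey_b]
print(pooled)  # both records come out as {'question': 'objectives_met', 'value': 0.8}
```

A common core of questions and a shared data format, which is what MEMS aims to provide, removes the need for these per-system mappings.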

RSNA
Radiological Society of North America
“to promote and develop the highest standards of radiology and related sciences through education and research”
Over 40,000 members
Online and in-person CME activities
Member of MedBiquitous
Francis Kwakwa, Chair of the MedBiquitous Metrics Working Group

MedBiquitous
Technology standards developer for healthcare education
ANSI accredited
Develops open XML standards
60 members (societies, universities, government, industry)
7 working groups

The Focus on Metrics
“Without the creation of a standard data set for reporting CME program outcomes … it is difficult to obtain consistent metrics of those outcomes. And if you can’t measure it, you can’t improve it.”
-- Ross Martin, MD, Director, Healthcare Informatics Group, Pfizer
Medical Education Metrics – MEMS

Another Perspective
“I need this to better understand how my program as a whole is doing.”
-- Nancy Davis, American Academy of Family Physicians

The MedBiquitous Metrics Working Group
Mission: to develop XML standards … for the exchange of aggregate evaluation data and other key metrics for health professions education.
Originally a subcommittee of the Education Working Group
Became a working group in April 2005
“We’re all using the same measuring stick…” --Francis

Who is Involved?
Francis Kwakwa, RSNA, Chair
Linda Casebeer, Outcomes Inc.
Nancy Davis, AAFP
Michael Fordis, Baylor College of Medicine
Stuart Gilman, Department of Veterans Affairs
Edward Kennedy, ACCME *
Jack Kues, University of Cincinnati
Tao Le, Johns Hopkins University
Ross Martin, Pfizer
Jackie Mayhew, Pfizer
Mellie Pouwels, RSNA
Andy Rabin, CE City
Donna Schoonover, Department of Veterans Affairs
* Invited experts

What’s in MEMS
Participation Metrics: how many participants
Learner Demographics: profession, specialty
Activity Description: name, type
Participant Activity Evaluation: survey results
Other types of evaluation metrics planned for future versions
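As a rough sketch of how these four categories of data might be grouped in an XML document, the following Python snippet builds an illustrative structure with xml.etree.ElementTree. The element names and values are invented for illustration and do not reproduce the actual MEMS schema; consult the MedBiquitous specification for the real vocabulary.

```python
import xml.etree.ElementTree as ET

# Illustrative only: element names and values are placeholders, not the MEMS schema.
metrics = ET.Element("EducationMetrics")

activity = ET.SubElement(metrics, "ActivityDescription")
ET.SubElement(activity, "Name").text = "Example Refresher Course"
ET.SubElement(activity, "Type").text = "online course"

participation = ET.SubElement(metrics, "ParticipationMetrics")
ET.SubElement(participation, "ParticipantCount").text = "40"

demographics = ET.SubElement(metrics, "LearnerDemographics")
ET.SubElement(demographics, "Profession").text = "Physician"
ET.SubElement(demographics, "Specialty").text = "Radiology"

evaluation = ET.SubElement(metrics, "ParticipantActivityEvaluation")
item = ET.SubElement(evaluation, "SurveyItem",
                     prompt="The course achieved its learning objectives")
ET.SubElement(item, "Responses", rating="agree").text = "30"

print(ET.tostring(metrics, encoding="unicode"))
```

The point of a standard like MEMS is that every activity, regardless of which system delivered it, reports this kind of data in one agreed-upon structure that downstream tools can aggregate.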

For more information: Metrics Working Group page (cs/index.html) and the MedBiquitous website

Discussion
Describe the learner surveys that you are using and how they differ from or are similar to the survey described.
What are the benefits or drawbacks of using a standardized survey?

RSNA’s Project
Adoption of the MEMS survey instrument coincided with implementation of a new learning management system (LMS)
Currently MEMS is used to evaluate over 300 online courses

RSNA’s Project: Types of online courses using MEMS
Cases of the Day (COD)
RadioGraphics CME Tests/Education Exhibits (EE)
Refresher Courses (RSP)

Results
Response charts were shown for three courses, one of each type: COD-45 (N = 24), EE-355 (N = 32), and RSP-2904 (N = 43). Each course was rated on the following survey items:
The course achieved its learning objectives
The course was relevant to my clinical learning needs
The course was relevant to my personal learning needs
The online method of instruction was conducive to learning
The course validated my current practice
I plan to change my practice based on what I learned in the course
The faculty provided sufficient evidence to support the content presented
Was the course free of commercial bias towards a particular product or company?
Did the course present a balanced view of clinical options?
(The rating distributions appeared as charts on the original slides and are not reproduced in this transcript.)

Group Discussion
What challenges to survey data collection and analysis have you faced?

Challenges Faced by RSNA
Survey is optional; little data available for some courses
Little variation in the data
Some disconnect with educators on how the data is used
Difficult to get data out of the LMS
Surveys for live events are not included

Key Strategies
Data collection:
Common core set of survey questions
Common format for evaluation data
Data analysis:
Compare within type and modality
Compare across type and modality
Look for trends and variation
Look for red flags
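A minimal analysis sketch along these lines, assuming responses have already been exported in one common format: the course codes, question key, scores, and the red-flag threshold below are arbitrary examples, not RSNA's actual figures.

```python
from collections import defaultdict
from statistics import mean

# (course_type, course_id, question, score on a 1-5 scale) -- example data only
responses = [
    ("COD", "COD-1", "relevant_to_clinical_needs", 5),
    ("COD", "COD-1", "relevant_to_clinical_needs", 4),
    ("EE",  "EE-1",  "relevant_to_clinical_needs", 4),
    ("RSP", "RSP-1", "relevant_to_clinical_needs", 2),
]

# Compare within and across course types by averaging per (type, question).
by_type = defaultdict(list)
for course_type, _course_id, question, score in responses:
    by_type[(course_type, question)].append(score)

for (course_type, question), scores in sorted(by_type.items()):
    avg = mean(scores)
    flag = "  <-- red flag" if avg < 3.0 else ""   # arbitrary threshold
    print(f"{course_type:4} {question}: {avg:.2f}{flag}")
```

The same grouping keys could be extended with modality (online versus live) once live-event surveys are captured in the common format.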

An Added Benefit
Assists with the program analysis and improvement required by the ACCME:
“The provider gathers data or information and conducts a program-based analysis on the degree to which the CME mission of the provider has been met through the conduct of CME activities/educational interventions.”
-- ACCME Updated Accreditation Criteria, September 2006