Creation of the Simulator Value Index Tool (adapted from a workshop presented on 4.21.14 by the American College of Surgeons Accreditation Education Institutes, Technologies & Simulation Committee)


Creation of the Simulator Value Index Tool
Adapted from a workshop presented on 4.21.14 by the American College of Surgeons Accreditation Education Institutes (Technologies & Simulation Committee)
Deborah Rooney, PhD; James Cooke, MD; Yuri Millo, MD; David Hananel
University of Michigan Medical School

Disclosures
o David Hananel, No Disclosures
o Yuri Millo, No Disclosures
o James Cooke, No Disclosures
o Deborah Rooney, No Disclosures

Overview of Main Topics
o Introduction of the project
o Overview of 2014 IMSH survey results
o Summary of 2014 ACS Consortium survey results
o Working meeting to refine the SVI algorithm
o Apply the SVI algorithm in a group exercise
o Discuss next steps

Introduction: How it all started
o ACS AEI, Technologies and Simulation Committee
o Guidelines for Simulation Development (Millo, George, Seymour and Smith)
o University of Michigan
o Need to support faculty in the simulator purchase/decision-making process (Cooke)
o Discourse
o Definition of "value"
o Differences across stakeholder roles (institution, administration, clinician, educator, researcher...)

Introduction: How it all started
o Reached consensus on factors used when considering a simulator purchase
o Survey 1: IMSH general membership, N = 2,800 (January 2014)
o Workshop 1: n = 16 (IMSH, January 2014)
o Survey 2: ACS AEI Consortium membership, N = 455 (March 2014)
o Workshop 2: n = ? (ACS AEI, March 2014)

Introduction: The Instrument
o Began with a 31-item web-based survey (Qualtrics)
o 4-point rating scale (1 = not considered/not important, 4 = critical to me when I consider a simulator purchase)
o 6 domains: Cost, Impact, Manufacturer, Utility, Assessment, Environment/Ergonomics
o Demographics: country/institution, stakeholder role, involvement, follow-up
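As a rough illustration of how responses to an instrument like this could be aggregated, the sketch below averages the 4-point importance ratings within each domain. The item-to-domain mapping, item IDs, and data layout are hypothetical; the slides do not describe the actual scoring or export format.

```python
from statistics import mean

# Hypothetical mapping of survey items to the six SVI domains.
# Item IDs (C1, U11, ...) follow the labeling used in the results slides,
# but the full item-to-domain assignment here is assumed for illustration only.
DOMAINS = {
    "Cost": ["C1", "C2", "C3"],
    "Impact": ["I1", "I2"],
    "Manufacturer": ["M1", "M2", "M3"],
    "Utility": ["U1", "U3", "U4", "U11"],
    "Assessment/Research": ["A1", "A2"],
    "Environment/Ergonomics": ["E1", "E2", "E3"],
}

def domain_means(response):
    """Average one respondent's 1-4 importance ratings within each domain.

    `response` maps item IDs to ratings on the survey's 4-point scale
    (1 = not considered/not important, 4 = critical to the purchase decision).
    Items the respondent skipped are ignored.
    """
    means = {}
    for domain, items in DOMAINS.items():
        ratings = [response[i] for i in items if i in response]
        if ratings:
            means[domain] = mean(ratings)
    return means

# Example: one made-up respondent's ratings.
example = {"C1": 4, "C2": 3, "I2": 4, "M1": 2, "U1": 4, "U11": 3, "E2": 1}
print(domain_means(example))
```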

IMSH Survey Sample: 67 institutions across 12 countries [world map of respondent counts by country]. 72 individuals completed the survey (approximately 2+% of the 2,800 IMSH members); 7 respondents did not designate an institution and 16 surveys were incomplete.

IMSH Survey Sample (US): 44 institutions across 22 states [US map of respondent counts by state].

IMSH Survey Sample: Institution Affiliation (n = 79, 1 undesignated) [bar chart of counts by affiliation; category labels not preserved in transcript].

IMSH Survey Results: Rating Differences by Institutional Affiliation
o Cost: Commercial Skills Centers (CSCs) rated C1 (purchase cost) lower than each of the other institution types, p = .001.
o Manufacturer: CSCs rated M1 (reputation of manufacturer) lower than each of the other institution types, p = .001.
o Utility: CSCs rated U3 (ease of data management) and U11 (portability) lower than each of the other institution types, p = .001.
o Ergonomics: Medical schools rated E2 (ergonomic risk factor) higher than the other institution types, p = .05; CSCs rated E3 (ease of ergonomic setup) lower than each of the other institution types, p = .001.

IMSH Survey Sample: Stakeholder Role (n = 79, 1 undesignated) [bar chart of counts by role; category labels not preserved in transcript].

IMSH Survey Results: Rating Differences by Stakeholder Role
o Cost: Clinicians rated C2 (cost of warranty) lower than the other stakeholders, p = .048.
o Utility: Clinicians rated U11 (portability of simulator) higher than other stakeholders, p = .037.

IMSH Survey Sample: Involvement in Decision (n = 80) [bar chart of counts by level of involvement; category labels not preserved in transcript].

IMSH Survey Results: Summary
o Although there were no differences across level of involvement,
o there were different considerations during the simulator purchasing process across:
o Country
o Institutional affiliation (commercial skills centers may have unique needs)
o Stakeholder role (clinicians may have unique needs)
o Keeping this in mind, let's review the top factors considered

The SVI Factors: Top 15+1 Factors Ranked (by average rating)
1. Technical stability/reliability of simulator (Utility)
2. Customer service (Manufacturer)
3. Ease of use for instructor/administrator (Utility)
4. Ease of use for learner (Utility)
5. Relevance of metrics to real life/clinical setting (Impact)
6. Ease of delivery and installation, orientation to sim (Manufacturer)
7. Reproducibility of task/scenario/curriculum (Assessment/Research)
8. Purchase cost of simulator (Cost)
9. Reputation of manufacturer (Manufacturer)
10. Scalability (Impact)
11. Quality of tutoring/feedback from sim to learners (Utility)
12. Number of learners impacted (Impact)
13. Cost of warranty (Cost)
14. Cost of maintenance (Cost)
15. Ease of configuration/authoring of sim's learning management system (Utility)
+1. Physical durability (Utility)
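The slides stop short of defining the SVI scoring algorithm itself, so the following is only a speculative sketch of one simple possibility: use stakeholder importance ratings for the top factors as weights, score each candidate simulator against the same factors, and normalize. All factor weights and candidate scores below are invented for illustration.

```python
def simulator_value_index(importance, scores):
    """Weighted score for one candidate simulator, normalized to 0-100.

    `importance`: factor -> stakeholder importance rating (1-4 scale).
    `scores`: factor -> how well this simulator meets the factor (1-4 scale).
    Both dictionaries are assumed to share the same factor keys.
    """
    weighted = sum(importance[f] * scores[f] for f in importance)
    max_possible = sum(importance[f] * 4 for f in importance)
    return 100 * weighted / max_possible

# Invented example weights for a few of the top-ranked factors.
importance = {
    "Technical stability/reliability": 4,
    "Customer service": 4,
    "Purchase cost": 3,
    "Physical durability": 3,
}
# Invented ratings of one candidate simulator against those factors.
candidate_a = {
    "Technical stability/reliability": 3,
    "Customer service": 4,
    "Purchase cost": 2,
    "Physical durability": 3,
}
print(f"SVI (candidate A): {simulator_value_index(importance, candidate_a):.1f}")
```

A higher result simply means the candidate better matches the factors this stakeholder weighted most heavily; comparing several candidates with the same importance weights is what would support the purchase decision.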

ACS Consortium Survey: Introduction
o Identical survey items and ratings
o Added a question on durability of the simulator: 31-item survey expanded to 32 items, web-based (Qualtrics)
o 4-point rating scale (1 = not considered/not important, 4 = critical to me when I consider a simulator purchase)
o 6 domains: Cost, Impact, Manufacturer, Utility, Assessment, Environment/Ergonomics
o Demographics: country/institution, stakeholder role, involvement, follow-up

ACS Survey Sample: 41 institutions across 7 countries [world map of respondent counts by country]. 65 total respondents; 54 individuals completed the survey (approximately 12% of the 455 ACS AEI Consortium members); 2 undesignated.

ACS Survey Sample (US): 36 institutions across 17 states [US map of respondent counts by state]; 47 participants indicated their institution.

ACS Survey Sample: Institution Affiliation (n = 55) [bar chart of counts by affiliation; category labels not preserved in transcript].

ACS Survey Sample: Stakeholder Role (n = 56) [bar chart of counts by role; category labels not preserved in transcript].

ACS Survey Sample: Involvement in Decision (n = 56) [bar chart of counts by level of involvement; category labels not preserved in transcript].

ACS Survey Results: Summary
o Although there were no differences across institution or stakeholder role,
o there were different considerations during the simulator purchasing process across level of involvement
o (self-reported "responsible" respondents were more concerned about number of learners impacted and scalability)

ACS Survey Results: Summary
o But are there differences across the IMSH and ACS membership?

Survey Results: IMSH v. ACS [chart of item-level rating differences: items 4 (C2), 7 (I2), 11 (M3), 15 (U4), and 22 (U11)].

Survey Results: Rating Differences by Conference
o Cost: ACS members rated C2 (cost of warranty) higher than IMSH members, bias = .40, p = .04.
o Impact: ACS members rated I2 (number of learners) higher than IMSH members, bias = .53, p = .01.
o Utility: ACS members rated U4 (ease of report generation) higher than IMSH members, bias = .43, p = .02.
o Utility: ACS members rated U11 (portability of simulator) higher than IMSH members, bias = .48, p = .01.
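The bias and p values above come from the authors' own analysis. For readers who want to run a rough comparison of item ratings between two membership groups on their own data, the sketch below uses a simple Welch two-sample t-test; it is a simplified stand-in, not the method reported in the slides, and the ratings are invented.

```python
from scipy import stats

def compare_item(acs_ratings, imsh_ratings):
    """Compare mean ratings of one survey item between two groups.

    Returns the difference in group means (ACS minus IMSH) and the
    Welch two-sample t-test p-value. Ratings are on the 4-point scale.
    """
    diff = (sum(acs_ratings) / len(acs_ratings)
            - sum(imsh_ratings) / len(imsh_ratings))
    t_stat, p_value = stats.ttest_ind(acs_ratings, imsh_ratings, equal_var=False)
    return diff, p_value

# Made-up ratings for item C2 (cost of warranty) from each conference sample.
acs_c2 = [4, 3, 4, 4, 3, 4, 3]
imsh_c2 = [3, 2, 3, 3, 2, 4, 3]
diff, p = compare_item(acs_c2, imsh_c2)
print(f"Mean difference = {diff:.2f}, p = {p:.3f}")
```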

The SVI Factors: Top 15+1 Factors Ranked

Applying the SVI Tool
o General impressions? What stood out?
o What worked well?
o What could have gone better?
o Any surprises?
o Usefulness? How might you use the SVI Tool at your institution?
o Please complete the questions on the "Feedback" tab of the SVI Worksheet

Thank you: Our Contact Information
o Deb Rooney, University of Michigan
o Jim Cooke, University of Michigan
o David Hananel, SimPORTAL & CREST, University of Minnesota Medical School
o Yuri Millo, Millo Group
o Olivier Petinaux, American College of Surgeons, Division of Education