
1 Performance Improvement in a Medical School: Defining Baseline Metrics – Pursuing Benchmark Targets

2 Diane Hills, Ph.D., Associate Dean for Academic Affairs, College of Osteopathic Medicine, Diane.Hills@dmu.edu
Mary Pat Wohlford-Wessels, Ph.D., Assistant Dean for Academic Quality and Medical Education Research, College of Osteopathic Medicine, Mary.Wohlford-Wessels@dmu.edu

3 Introduction
Improving medical education requires systematic processes that support the review and assessment of the work we do. The outcome of careful review supports effective strategic planning, resource allocation, resource utilization, faculty development, curricular change, research development, and much more.

4 This presentation builds upon last year's AACOM presentation, in which DMU introduced its new performance improvement plan and processes and presented our intent to implement a system of review framed within the Baldrige Quality Criteria. Since then, we have adopted the Baldrige criteria and now collect and format our annual Performance Improvement (PI) report within them.

5 Last year, we introduced session participants to:
- Our committee structure
- A Gantt chart of PI activities
- Proposed data utilization
- How we classified data sources into meaningful categories

6 Performance Improvement Report
Developed annually and distributed to college and university stakeholders:
- 2004: represented initial efforts
- 2005: formatted using the Baldrige criteria; represented early benchmark development
- 2006: will focus on clinical education and postgraduate perceptions (PGY1 trainees and residency directors)

7 Baldrige Values
- Visionary leadership
- Learning-centered education
- Organizational and personal learning
- Valuing faculty, staff, and partners
- Agility
- Focus on the future
- Managing for innovation
- Management by fact
- Social responsibility
- Focus on results
- Systems perspective

8 Baldrige Criteria
- Leadership
- Strategic Planning
- Student, Stakeholder, and Market Focus
- Measurement, Analysis, and Knowledge Management
- Faculty and Staff Focus
- Process Management
- Results

9 What we have learned about culture and leadership
The Baldrige "Are We Making Progress?" survey compared faculty responses to those of 228 individuals from organizations engaged in the Baldrige process. DMU-COM faculty responses were:
- significantly higher than the national average on 1 question
- significantly lower than the national average on 8 questions
- statistically indistinguishable from the national average on 30 questions
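As an illustration only: a comparison like this can be run from per-question summary statistics alone. Below is a minimal sketch in Python, assuming each group reports a mean, standard deviation, and sample size per question; every label and number is a hypothetical placeholder, not DMU survey data.

from scipy.stats import ttest_ind_from_stats

# (question, dmu_mean, dmu_sd, dmu_n, natl_mean, natl_sd, natl_n)
# All values are hypothetical placeholders.
questions = [
    ("Senior leaders share our mission", 4.2, 0.6, 35, 3.9, 0.8, 228),
    ("I know how my work quality is measured", 3.1, 0.9, 35, 3.6, 0.7, 228),
]

for label, m1, s1, n1, m2, s2, n2 in questions:
    # Welch's t-test from summary statistics (no raw responses needed).
    t, p = ttest_ind_from_stats(m1, s1, n1, m2, s2, n2, equal_var=False)
    verdict = "differs from" if p < 0.05 else "is consistent with"
    print(f"{label}: t={t:+.2f}, p={p:.3f} -> {verdict} the national average")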

10 What we have learned about faculty and workload
Faculty workload is quite variable, even after controlling for discipline. Approximately 45% of basic science teaching effort supports other University programs. Lecture accounts for 25% of total teaching effort and scheduled laboratory time for 44%; the remainder is dedicated to small-group learning. Research growth has been dramatic due to the efforts of a core group of basic science faculty.

11 What we have learned about student outcomes and the curriculum
Students perform well on COMLEX Level 1 in terms of both pass rate and average score. The pass rate and average score are lower on COMLEX Level 2-CE and lower still on Level 3. The curriculum for years 1 and 2 is well managed, and faculty are responsive to needed improvements. Years 3 and 4 have received less review; new staff and an enhanced focus will bring significant changes to the clinical portion of the curriculum.

12 What we have learned about OMM
A survey of 3rd-year and graduating 4th-year students (n=192) regarding their OMM training revealed:
- 83.2% are confident in their OMM training.
- 84% said only a small percentage (0-25%) of their DO preceptors used OMM in their practice.
- 67.5% said they rarely or never had an opportunity to use OMM during the clinical portion of their training.
What does this mean for our curriculum? What should it mean for the profession?
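One way to read these percentages more cautiously is to attach confidence intervals, since each is a proportion of the 192 respondents. A minimal sketch, assuming each figure is a simple proportion of the full sample (labels paraphrased from the slide):

from statsmodels.stats.proportion import proportion_confint

n = 192  # survey respondents
findings = [
    ("confident in their OMM training", 0.832),
    ("saw few DO preceptors use OMM", 0.840),
    ("rarely or never used OMM clinically", 0.675),
]

for label, pct in findings:
    count = round(pct * n)  # approximate count behind each reported percentage
    low, high = proportion_confint(count, n, alpha=0.05, method="wilson")
    print(f"{label}: {pct:.1%} (95% CI {low:.1%} to {high:.1%})")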

13 What we have learned regarding research growth
Research and scholarship productivity continues to grow. National Center for Higher Education Management Systems (NCHEMS) data indicate that DMU-COM funding is competitive with peer private osteopathic colleges.

14 What we have learned about our mission and vision
- The College mission statement needed to be revised.
- The vision statement needed to be revised.
- Values statements needed to be written.

15 Data Development and Growth
- DMU-COM data (2003-04)
- AAMC allopathic medical school data (2004-05)
- NBOME (2003-04)
- Residency directors (2005-06)
- AACOM osteopathic medical school data (2003-04)
- NCHEMS (2003-04)

16 Where does DMU rank?
DMU tracks public information on the following schools:
- Arizona College of Osteopathic Medicine
- College of Osteopathic Medicine of the Pacific
- Touro University College of Osteopathic Medicine
- Nova Southeastern University College of Osteopathic Medicine
- Chicago College of Osteopathic Medicine
- Des Moines University
- Pikeville College School of Osteopathic Medicine
- University of New England College of Osteopathic Medicine
- Michigan State University College of Osteopathic Medicine
- Kirksville College of Osteopathic Medicine
- UMDNJ School of Osteopathic Medicine
- New York College of Osteopathic Medicine
- Ohio University College of Osteopathic Medicine
- Oklahoma State University College of Osteopathic Medicine
- Philadelphia College of Osteopathic Medicine
- University of North Texas Health Science Center
- West Virginia School of Osteopathic Medicine

17-19 Where does DMU rank? (comparison charts)

20 Next Steps
- Begin to develop correlations between clinical experiences and student clinical outcomes (see the sketch after this list).
- Further collect and analyze graduate feedback (performance perceptions from graduates and residency directors).
- Begin to develop assessment research methods to determine the effectiveness of utilizing patient simulators.
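The sketch referenced in the first item: the correlation work could begin by pairing a per-student clinical-experience measure with an outcome measure. This is a minimal illustration with made-up data; the measures (logged encounters, exam scores) are assumptions standing in for whatever the college actually collects.

from scipy.stats import pearsonr

# Hypothetical paired per-student measures.
encounters = [120, 95, 140, 110, 88, 150, 105, 132]   # clinical encounters logged
scores     = [512, 488, 545, 500, 470, 560, 495, 530] # e.g., COMLEX Level 2 scores

r, p = pearsonr(encounters, scores)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")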

21 Next Steps (continued)
- Continue to refine the Faculty Adequacy (Workload) Model.
- Use existing information about research productivity to develop research-related targets.
- Investigate the use of faculty e-portfolios.
- Investigate the use of student e-portfolios.
- Continue to develop the Lecture Level Database (LLDB) to better manage the assessment of objectives and competencies (a schema sketch follows).
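The schema sketch referenced in the last item: at its core the LLDB is a lecture-to-competency map. One possible shape, using SQLite purely for illustration; the table and column names are assumptions, not the college's actual design.

import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE lecture (
    lecture_id INTEGER PRIMARY KEY,
    course     TEXT NOT NULL,
    title      TEXT NOT NULL
);
CREATE TABLE competency (
    competency_id INTEGER PRIMARY KEY,
    description   TEXT NOT NULL
);
-- Many-to-many map: which lectures address which competencies.
CREATE TABLE lecture_competency (
    lecture_id    INTEGER REFERENCES lecture(lecture_id),
    competency_id INTEGER REFERENCES competency(competency_id),
    PRIMARY KEY (lecture_id, competency_id)
);
""")

# Coverage report: how many lectures address each competency?
for description, n_lectures in con.execute("""
        SELECT c.description, COUNT(lc.lecture_id)
        FROM competency c
        LEFT JOIN lecture_competency lc USING (competency_id)
        GROUP BY c.competency_id, c.description
    """):
    print(description, n_lectures)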

22 Summary
The process adopted several years ago and refined over the past two years has left the college knowing far more about its outcomes and operations. We have become more sophisticated in our collection and use of data, and we increasingly use data to make decisions.

