1 Maryellen E. Gusic, MD
Associate Dean, Clinical Education
Professor of Pediatrics, Penn State College of Medicine

2 The work and contributions of educators must be visible to be valued
“…we cannot value something that we cannot share, exchange, examine.” (Lee Shulman, 1990)

3 Acknowledgements
- Connie Baldwin, PhD, University of Rochester Medical Center
- Latha Chandran, MD, MPH, Stony Brook University Medical Center
Co-leaders of the Academic Pediatric Association Educational Scholars Program

4 Do educators in academic health centers have time for scholarship? Is their contribution to the quality of future physicians valued?
“The growing emphasis on delivery of clinical services and the concomitant decrease in time for tenured and clinician-educator faculty to teach and do scholarly work jeopardizes both the potential for continued discovery and the education of the next generation of medical scholars.” (Barchi and Lowery, Academic Medicine 2000)

5 Are educators under-developed as academicians?
Promotion criteria for clinician educators examined by Beasley et al. in 1997
Importance of criteria for assessment (scale of 1-7):
- teaching skills (6.3)
- clinical skills (5.8)
- development of educational programs (5.3)
- nonresearch scholarship (5.1)
- education research (4.5)
Tools used to evaluate teaching: awards, peer evaluation, learner evaluation, teaching portfolio

6 Academic advancement slower for clinician educators (Thomas et al., Academic Medicine 2004)
- Odds of being at a higher rank were 85% less for academic clinicians and 69% less for teacher clinicians than for basic researchers
- Adjusted for age, gender, time in rank and work satisfaction
- Satisfaction with progress towards academic promotion 92% lower for academic clinicians and 87% lower for teacher-clinicians
- Rigor of the promotion process lessened by paucity of valid evaluation methods for teaching and clinical practice

7 There are problems with the current systems of recognition for clinician-educators (Levinson and Rubenstein, 2000)
- Lack of reliable measures of teaching excellence
- Lack of valid methods that measure outcomes of teaching and educational programs
- Lack of congruence between job responsibilities and criteria by which faculty are judged for promotion

8 Judgments must be based on explicit criteria
- Faculty members, department chairs, and P&T committee chairs and members may have differing definitions of excellence
- In addition, there may be differing opinions/perceptions of the relative value of educational contributions in the P&T process
- Work often discounted because it is not documented adequately or not understood by P&T committee members

9 First step: Expanding the definition of scholarship
- In 1990, Boyer challenged the concept that teaching is simply an expected task performed by all academic physicians
- Expanded definition of scholarship to include the scholarship of application, integration and teaching in addition to the scholarship of discovery
- Reality: scholarship of discovery often the most valued realm in academic institutions

10 “The elusiveness of the scholarship of teaching” (Glassick, Academic Medicine 2000)
Adoption of Boyer’s expanded definition of scholarship challenged by:
- Lack of agreement about the meaning of this category of scholarship
- Lack of agreement about how quality should be measured
Excellent teaching is not the same as the scholarship of teaching

11 Glassick created an “equal playing field” by establishing common criteria for scholarship:
- Clear goals
- Adequate preparation
- Appropriate methods
- Significant results
- Effective presentation
- Reflective critique

12 One solution used by academic health centers: the creation of various promotion tracks (Nora et al., Academic Medicine 2000)
Challenges of different tracks:
- Perceived value/status
- Tenure eligibility
- Congruence of expectations for performance with assigned activities of faculty members
- Ability to change tracks as careers evolve over time
Separate promotion tracks less important than “appropriate methods to evaluate” performance (Beasley et al., JAMA 1997)

13 Promotion committees must accept an expanded definition of scholarship
- Criteria for promotion must include the scholarship of teaching
- Educational “credits” are more difficult to document than research “credits”
- Documentation standards must allow for methods that establish the quality and impact of the work of educators

14 Challenges of traditionally accepted academic documents
- CV mainly documents educational quantity (countable data)
- CV does not typically allow flexibility to document quality and impact measures of educational activities
- Challenge for educators to provide evidence that demonstrates a scholarly approach using traditional formats
- Use of grants and publications as the only markers of scholarship is inadequate in capturing the work of educators

15 Educator Portfolios (EPs) show quantity, quality, and impact of an educator’s work
- Documentation template that allows faculty to make their educational activities and accomplishments visible and to establish impact
- Prove value

16 EPs have multiple uses
- For use in the P&T process
- For annual performance review
- For negotiating a new position, raise, or time for educational work
- For goal setting and meeting with a mentor/advisor
- For writing a biographical sketch or grant proposal
- For updating your CV
- For award nominations
- For applying for a new job

17 Developmental vs Promotional EPs
Developmental EPs:
- Formative document
- Provides broad perspective
- Helps to strategically plan career and intentionally plan educational work
- Tracks over time
- Aids in reflective practice
- Serves as communication tool with mentors
- Foundation for developing promotional EP
Promotional EPs:
- Summative document
- Highlights, summarizes major accomplishments and key achievements
- Short, focused presentation
- Personal statement to provide context for review of work
- Summarized evidence of quality and effectiveness

18 The use of EPs in the P&T process in US medical schools (Simpson et al., Academic Medicine 2004)
- 400% increase since 1992 in the number of schools using portfolios in promotion packets
Observations:
- Dissemination of work an important factor for inclusion
- Infrequent use of outcome measures or internal/external review of educational work

19 Consistency of categories included in EPs, but limited consensus on types of evidence to prove quality and impact
Interviews of faculty responsible for appointments/promotions revealed that excellence was not explicitly defined:
“We know what we want to look for…but it is not really codified…”
“We gave up defining scholarship because it was eating up so much time and we could not get consensus. We have just been going ahead with the art and the ‘we know it when we see it’ approach.”

20 Lack of accepted common terminology, lack of standards for documentation, and lack of guidelines and criteria for the evaluation of the content of EPs limit their success in accomplishing this goal

21 Documentation standards for educators explored in 2006 in a Consensus Conference on Educational Scholarship, convened by the AAMC Group on Educational Affairs

22 Affirmation of 5 categories of educational activity and accomplishment:
- Teaching
- Curriculum
- Advising and/or mentoring
- Educational leadership and/or administration
- Learner assessment

23 Excellence requires “Q² Engage”
- Quantity: measures of the types and frequencies of activities and roles
- Quality: evidence of effectiveness using comparative measures
- Engagement: evidence of engagement with the community of educators

24 Engagement measured through a scholarly approach and scholarship
- Use of a scholarly approach demonstrated through evidence that one’s work builds on the work of others
- Scholarship requires “P³”: Public display, Peer review, Dissemination (creating a platform upon which others can build)

25 A scholarly approach is proactive and reflective
- Evidence of a systematic approach using best practices or information from the literature
- Reflective practice: using self-assessment and information from others to enhance future educational efforts

26 Dissemination of scholarly products allows peer review
- Peer review uses accepted criteria of evaluation
- To be considered scholarship, products must be presented in a peer-reviewed venue or repository
- Allows use of the product by others
- Allows others to build upon the work of the scholar

27

28 Next step: Development of an accepted set of standards by which to value the work of educators
- Faculty would better understand expectations for performance and judgment criteria
- Self-assessment allows faculty to build skills in an organized fashion
- Educational programs would improve: development, implementation and evaluation of the programs would consider guidelines for excellence and a scholarly approach
- Faculty and evaluators would share a common language
- Education would be seen and valued as a viable career track in academic medicine

29 Criteria for the evaluation of educators can be refined (Fincher et al., Academic Medicine 2000)
- The work of educators must be evaluated to be recognized and rewarded
- Effectiveness of teaching must be “rigorously substantiated”
- The results of educational leadership must be “demonstrable and broadly felt”
- The advancement of learning must be measured to assess educational methods and programs

30 Although more widely used, EPs lack a widely accepted, standardized format
EPs remain difficult to assess in the absence of recognized standards for documentation and evaluation

31 Academic Pediatric Association (APA) Educational Scholars Program: An EP “test tube”
- ESP is a national faculty development program for pediatric educators
- We developed an EP template for use by our scholars
- Structure for systematically presenting numeric and narrative data

32 APA EP template peer reviewed and published on MedEdPortal
Inclusion of 5 standard domains, plus additional items:
- Educational philosophy statement: evolves from an understanding of theory and best practices combined with experience and reflection on teaching
- Five-year goals as an educator
- Evidence of scholarly accomplishment
http://www.ambpeds.org/site/education/education_faculty_dev_template.htm
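To make the template's structure concrete, here is a minimal sketch of how a portfolio built on these sections could be captured as structured data. The five domains and the additional items come from the slides above; every class and field name is an illustrative assumption, not part of the published APA template.

```python
# A minimal sketch (not the published APA template) of how an EP built on the
# five affirmed domains could be represented. All class and field names are
# illustrative assumptions.
from dataclasses import dataclass, field
from typing import List

APA_DOMAINS = [
    "Teaching",
    "Curriculum",
    "Advising and/or mentoring",
    "Educational leadership and/or administration",
    "Learner assessment",
]

@dataclass
class DomainEntry:
    domain: str                   # one of APA_DOMAINS
    activities: List[str]         # quantity: types and frequencies of roles
    quality_evidence: List[str]   # comparative measures of effectiveness
    narrative: str = ""           # reflective comments following the domain

@dataclass
class EducatorPortfolio:
    educational_philosophy: str   # grounded in theory, best practices, reflection
    five_year_goals: List[str]
    domains: List[DomainEntry] = field(default_factory=list)
    scholarship_evidence: List[str] = field(default_factory=list)  # peer-reviewed products
```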

33 We have also created a systematic tool for analysis of EPs: the APA EP Analysis Tool
Peer reviewed and published on MedEdPortal
http://www.ambpeds.org/site/education/education_faculty_dev_template.htm

34 Use of a parallel template for the portfolio and the analysis tool allows valid and reliable evaluation
The analysis tool:
- Allows reproducible analysis for use across disciplines and across institutions
- Promotes the same methodology used in the evaluation of researchers
Principles which guided the development of each:
- Use of measurable outcomes to demonstrate impact
- Quantitative and qualitative measures to ensure objective analysis

35 The analysis tool was developed through a formal consensus-building process (Academic Medicine 2009)
L. Chandran, C. Baldwin, T. Turner, E. Zenni, L. Lane, D. Balmer, M. Bar-on, D. Rauch, D. Indyk, L. Gruppen
- Multiple rounds of item development and selection
- Enhancement of the template to improve the quality of information available for review
- Creation of a set of instructions for use of the tool to promote reliable application of standards

36 [Diagram: development of the EP analysis tool]
- Template development: EP template, with two rounds of EP template revision and MedEdPortal approval
- Tool development: initial lists of >100 quantitative items and 52 qualitative items were selected and combined into Tool 1.1 (43 items), tested, refined and reconciled into Tool 1.2 (48 items), and finalized as Tool 2.0 (36 items*), with MedEdPortal approval
- Inter-rater reliability testing: Step 1 (27 EPs, 6 raters), Step 2 (5 EPs, 4 raters), Step 3 (3 EPs, 3+2 raters), Step 4 (2 EPs, 8 raters), Step 5 (15-20 EPs, 8 raters)

37 Analysis tool item summary
- 18 quantitative items, including index scores that combine related measures; weights used for index scores are calibrated across the tool to ensure equivalence
- 18 qualitative items measured using a three-point scale (novice/intermediate/expert); the intermediate rating is defined with verbal specifications
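As a rough illustration of the two mechanics described here, the sketch below maps the three-point qualitative scale to numbers and computes a weighted index score from related quantitative measures. The item names and weights are invented for the example; they are not taken from the APA tool.

```python
# Illustrative sketch only: the real tool defines its own items and calibrated
# weights. Shown here: (1) an index score combining related quantitative
# measures via weights, and (2) a qualitative rating recorded numerically so
# it can sit alongside the quantitative items.
from typing import Dict

QUALITATIVE_SCALE = {"novice": 1, "intermediate": 2, "expert": 3}

def index_score(measures: Dict[str, float], weights: Dict[str, float]) -> float:
    """Combine related quantitative measures into a single weighted index."""
    return sum(weights[name] * value for name, value in measures.items())

# Hypothetical teaching-quantity index built from two related measures.
teaching_index = index_score(
    measures={"contact_hours_per_year": 120, "learner_levels_taught": 3},
    weights={"contact_hours_per_year": 0.01, "learner_levels_taught": 0.4},
)

# Hypothetical qualitative item (curriculum design) recorded numerically.
curriculum_rating = QUALITATIVE_SCALE["intermediate"]

print(f"teaching index = {teaching_index}, curriculum rating = {curriculum_rating}")
```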

38 Guidelines for choice of standards
- Quantitative measure used if it was valid, important, and could be reliably measured
- Qualitative measure used to capture information that was not readily quantifiable
- Structured reporting format required for qualitative assessment
- Accepted constructs applied to enhance the credibility of qualitative standards: Miller’s criteria for learner assessment strategies; the GNOME model for curriculum design

39 Measurement of scholarly activity
Scholarly approach to education:
- Entire EP reviewed, with special attention to the educational philosophy, five-year goals, and the narrative comments which follow each domain
- Evidence of reflective practice and use of “best practices” from the literature
- Special consideration of the educator’s focal educational effort, assessed using the framework for excellence established by Glassick

40 Evaluation of a scholarly approach
EP content analyzed for:
- Evidence of systematic planning
- Consultation with literature/best practices
- Rigorous measurement of educational quality and outcomes
- Products/methods assessed through peer review: presentations, publications, adoption of products by others

41 Products of educational scholarship
- Includes peer-reviewed publications, presentations, and disseminated educational products adopted by others
- Public dissemination, peer review, and a platform for others

42 Lessons learned in the development of the analysis tool
- Focused selection of essential items makes the tool practical
- Quantitative items are based on judgment of quality, not just numbers
- Specification of qualitative ratings is critical to achieve concordance
- Qualitative items must be recorded numerically to give them equivalence with quantitative items
- The ability to use the tool is dependent on the quality of the data submitted: information must be documented meticulously

43 The goal of our current project:
- To create a set of general principles and specific criteria for faculty evaluation, regardless of the template used to document educational activities
- To promote a common understanding using a common and established vocabulary for excellence
- To allow individual institutions to set and apply fair and rational standards for consistent decision-making
- To encourage continued conversation among the community of educators
- To offer a sample tool based on the principles discussed

44 Purpose of the current project
- To establish a sound foundation for academic promotion and advancement of educators
- To provide a framework for the systematic analysis of educator performance

45 We do not expect a national consensus about precise criteria for the advancement of educators
Principles must be applied with consideration of the individual culture of each institution:
- Expectations for faculty performance
- Needs of the educational mission

46 “Successful educators…need resources to fulfill the educational mission.” (Simpson et al., Summary Report from the Consensus Conference on Educational Scholarship, 2007)
“We must evolve continuously our organizational structures, human resources activities, political coalitions, and symbols to support scholarship in education.” (Fincher et al., Academic Medicine 2000)

47 Development of a sound rating system will require the institution to develop and implement:
- Accurate and complete data sources
- Definition and acceptance of specific criteria for evaluation
- Consistent application of criteria
- Inclusion of quantitative and qualitative measures

48 Additional topics for conversation
Institutional requirements/preferences need to be considered in developing the rating system:
- Should each domain be assigned an equal value?
- Should faculty members be expected to be active/demonstrate excellence in each domain? It is unlikely that a faculty member will display equivalent performance in all of the domains included in an EP
- How many areas of excellence are required for advancement?

49 How should the value of each domain be established?
- National consensus + local considerations
- Scholarly approach vs products of scholarship: while both are important parts of establishing the credentials of an educator, products of scholarship carry more value than use of a scholarly approach
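One way to picture the weighting questions raised on these two slides is the sketch below, in which an institution sets local domain weights and, within each domain, weights products of scholarship more heavily than a scholarly approach. Every number is a placeholder for discussion, not a recommended value from this presentation.

```python
# Placeholder numbers only: a sketch of how local domain weights and the
# preference for products of scholarship over a scholarly approach could be
# encoded when rolling ratings up to an overall score.
from typing import Dict

DOMAIN_WEIGHTS = {            # hypothetical local weights, summing to 1.0
    "Teaching": 0.30,
    "Curriculum": 0.20,
    "Advising and/or mentoring": 0.15,
    "Educational leadership and/or administration": 0.20,
    "Learner assessment": 0.15,
}

def domain_score(scholarly_approach: float, scholarship_products: float) -> float:
    """Combine the two components, valuing disseminated products more heavily."""
    return 0.4 * scholarly_approach + 0.6 * scholarship_products

def overall_score(domain_scores: Dict[str, float]) -> float:
    """Roll domain scores up into one rating using the local domain weights."""
    return sum(DOMAIN_WEIGHTS[d] * s for d, s in domain_scores.items())

# Example: each domain rated on the 1-3 scale (novice/intermediate/expert).
scores = {d: domain_score(2, 3) for d in DOMAIN_WEIGHTS}
print(overall_score(scores))   # 2.6 with these placeholder weights
```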

50

51 Principles for educator evaluation
- Evaluations must be based on objective criteria
- Use both quantitative and qualitative measures
- Expect educators to plan systematically to help learners achieve specific, evaluable learning objectives
- Expect scholarly activity from all faculty
- Evaluate scholarship rigorously
- Expect variation among educators
- Inform faculty of criteria
- Educate those who evaluate educators to recognize superior performance

