
1 Program Evaluation & Faculty Participation By Group 4: Charlotte Featherston, Sara Martin, Christina (Gaupp) Stacy, and Elsy Thomas

2 Historical Perspective & Background: early approaches to educational program evaluation and Ralph Tyler’s behavioral objective model.

3 A Classic Model: The Tyler Model Often referred to as the “objective model,” it emphasizes consistency among objectives, learning experiences, and outcomes. Curriculum objectives indicate both the behavior to be developed and the content area in which it is to be applied (Keating, 2006).

4 Tyler’s Four Principles of Teaching Principle 1: Defining appropriate learning objectives. Principle 2: Establishing useful learning experiences. Principle 3: Organizing learning experiences to have a maximum cumulative effect. Principle 4: Evaluating the curriculum and revising those aspects that did not prove to be effective (Keating, 2006).

5 Primary Strengths of Tyler’s Model Clearly stated objectives provide a good place to begin. The model involves the active participation of the learner (Prideaux, 2003) and offers a simple, linear approach to the development of behavioral objectives (Billings & Halstead, 2009).

6 Progression of Program Evaluation 1980s: outcome assessments, state legislatures, and the National League for Nursing (NLN). 1990s: the Commission on Collegiate Nursing Education (CCNE).

7 Program Evaluation: 2000 Sauter (2000) surveyed all baccalaureate nursing programs in the United States to determine how they develop, implement, and revise their program evaluation plans. In 2006, Suhayda and Miller reported on the use of Stufflebeam’s CIPP model to provide a framework for comprehensive program evaluation serving both undergraduate and graduate nursing programs.

8 Relevance & Justification

9 Program Evaluation: set expectations, collect data, and use data.

10 Importance of Evaluation

11 Relevance

12 The ABCs of Evaluation

13 Impact on Program Evaluation: purpose, scope, focus, and program design.

14 Focus of Impact Evaluation Participants’ perceptions and satisfaction; participants’ beliefs about teaching and learning; participants’ teaching performance; students’ perceptions of staff teaching performance; student learning; and effects on the culture of the institution.

15 Resources to Conduct Impact Evaluation Reliable and valid instruments, trained data collectors, personnel with research and statistical expertise, equipment for data collection, and equipment for data analysis.

16 When to Do Impact Evaluation 1. When a new program is added to the curriculum; 2. For pilot programs that are due to be markedly scaled up; 3. For ongoing programs.

17 Evaluation of the Curriculum: curriculum design, the discipline of knowledge, and the characteristics of the discipline.

18 Conclusion Program evaluation is collaborative, comprehensive, and complex. By understanding the history of program evaluation, we can better understand the theory behind it.

19 Conclusion Evaluation should focus on a specific purpose with the goal of long-term improvement. Evaluators must consider program values along with societal expectations.

20 Conclusion “Development and implementation of a carefully designed theory-driven program evaluation plan will support continuous quality improvement for nursing education programs” (Billings & Halstead, 2009, p. 507). Assess → Plan → Improve.

21 References
Bastable, S. B. (2013). Nurse as educator (4th ed.). Sudbury, MA: Jones and Bartlett.
Billings, D. M., & Halstead, J. A. (2009). Teaching in nursing: A guide for faculty (3rd ed.). St. Louis, MO: Elsevier Saunders.
Denham, T. J. (2002). Comparison of two curriculum/instructional design models: Ralph W. Tyler and Siena College accounting class, ACCT205. Retrieved from ERIC database. (ED 471734)
Educational Development Programs, 6(2), 96-108. Retrieved from http://www.tandfonline.com/doi/abs/10.1080/13601440110090749
Keating, S. (2006). Curriculum development and evaluation in nursing. Philadelphia, PA: Lippincott Williams & Wilkins.
Klein, C. (2006). Linking competency-based assessment to successful clinical practice. Journal of Nursing Education, 45(9), 379-383.
McDonald, M. C. (2014). Guide to assessing learning outcomes (3rd ed.). Sudbury, MA: Jones and Bartlett.
Northeastern Illinois University. (n.d.). Classical model: Ralph Tyler, 1949, book summary. Retrieved from www.neiu.edu/~aserafin/New%20Folder/TYLER.html
Oermann, M., & Gaberson, K. (2006). Evaluation and testing in nursing education (2nd ed.). New York, NY: Springer Publishing Company.
Outline of principles of impact evaluation. (n.d.). Retrieved from http://www.oecd.org/dac/evaluation/dcdndep/37671602.pdf
Prideaux, D. (2003). Curriculum design: ABC of learning and teaching in medicine. British Medical Journal, 326(7383), 268-270. Retrieved from http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1125124/?tool=pubmed
Ross, A. (2010). Survey data collection for impact evaluation. Retrieved from http://siteresources.worldbank.org/EXTHDOFFICE/Resources/5485726-1256762343506/6518748-1292879124539/25.Collecting-Quality-Data-for-Impact-Evaluation_Adam
University of South Florida College of Education. (n.d.). Ralph Tyler’s little book. Retrieved from www.coedu.usf.edu/agents/dlewis/publications/tyler.htm

