The Kirkpatrick Model and the Beneficence Principle: Examining the Evaluation Process
Mary Kobusingye


Overview
The purpose of this paper is to examine the Kirkpatrick model, the reasons for its popularity in organizations, its limitations, and the potential risks it raises for evaluation clients and stakeholders. Evaluation models such as Kirkpatrick's need to be subjected to the fundamental ethical questions about evaluation: Is the right thing being done, and is it being done well? The paper then considers the extent to which the Kirkpatrick model is consistent with the principle of beneficence.

Kirkpatrick's four levels of evaluation
Level 1: Reaction. Level 2: Learning. Level 3: Behavior. Level 4: Results.
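The four levels each ask a different question about a training program. As a quick sketch, they can be written down as a simple data structure; the example measures below are illustrative assumptions added for this sketch, not part of the original slides:

```python
# Sketch of Kirkpatrick's four evaluation levels.
# The "example_measure" entries are hypothetical illustrations.
KIRKPATRICK_LEVELS = {
    1: {"name": "Reaction",
        "question": "Did participants find the training engaging and relevant?",
        "example_measure": "post-session satisfaction survey"},
    2: {"name": "Learning",
        "question": "Did participants acquire the intended knowledge or skills?",
        "example_measure": "pre/post knowledge test"},
    3: {"name": "Behavior",
        "question": "Are participants applying what they learned on the job?",
        "example_measure": "supervisor observation after 90 days"},
    4: {"name": "Results",
        "question": "Did the training affect organizational outcomes?",
        "example_measure": "change in productivity or error rates"},
}

# Print the ascending sequence of evaluation questions.
for level in sorted(KIRKPATRICK_LEVELS):
    info = KIRKPATRICK_LEVELS[level]
    print(f"Level {level} ({info['name']}): {info['question']}")
```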

Popularity of the four-level model
It has addressed the need of training professionals to understand training evaluation in a systematic way. It emphasizes level 4 as providing the most important information. It simplifies the complex process of training evaluation.

Limitations of Kirkpatrick's four-level model
The incompleteness of the model. The assumption of causal linkages between the levels. The assumption that information becomes more important at each ascending level.
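The causal-linkage limitation can be made concrete with a toy sketch. The numbers below are synthetic, invented purely for illustration: trainees report high level-1 satisfaction, yet level-4 organizational results barely move, so inferring results from reactions would mislead.

```python
# Toy illustration (synthetic numbers, not real study data) of why the
# causal-linkage assumption is risky: high reaction scores (level 1)
# coexist with essentially flat organizational results (level 4).
reaction_scores = [4.5, 4.8, 4.6, 4.9, 4.7, 4.4]    # 1-5 satisfaction ratings
result_deltas   = [0.1, -0.2, 0.0, 0.3, -0.1, 0.2]  # % change in a hypothetical KPI

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

r = pearson(reaction_scores, result_deltas)
print(f"mean reaction: {sum(reaction_scores) / len(reaction_scores):.2f} / 5")
print(f"reaction-results correlation: r = {r:.2f}")
```

In this made-up sample the reactions are uniformly strong while the correlation with results is near zero, which is exactly the pattern the causal-linkage assumption would hide.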

Beneficence principle
It is the role of every training professional to advance the welfare of the individuals and organizations for whom they work. It is an ethical duty to help others further their important and legitimate interests and to confer benefits on clients and stakeholders when possible. Training evaluators should help organizations determine whether a program was effective and what can be done to improve the training process. Failure to provide a benefit when in a position to do so is a violation of professional ethics.

Kirkpatrick's model and the beneficence principle
The principle can be used to assess the potential risks versus benefits associated with the Kirkpatrick model. Limitations of the model could prevent evaluators from adequately addressing the principle of beneficence:
o The model assumes that measuring any one of the four levels provides adequate information; key conceptual factors are left out, which may lead to misleading judgments.
o The causal-linkage assumption infers a connection between reaction measures and unmeasured outcomes at other levels, which can lead to an inaccurate view of training effectiveness and pose a risk to clients.

Continued
o The assumption that level 4 is most crucial poses substantial risk for clients and stakeholders, since many other factors can cause changes in financial and other performance measures at the organizational level.
o The assumption also ignores potential differences in stakeholders' views about training outcomes and about what is important to measure in assessing training effectiveness. An organization that relies on this assumption risks undermining one of the main purposes of training evaluation: the utilization of evaluative findings.

Suggestions
Training evaluators have an ethical obligation to improve their models and practice in ways that increase their capacity to benefit clients and stakeholders more meaningfully. Kaufman extended level one of the Kirkpatrick model to include the valuation of resources, and added a fifth level concerned with societal impact. Utilizing valid, reliable, and easy-to-use assessment scales and instruments that complement such models can help training evaluators examine a range of key input variables, for example:

Continued
o Research has developed instruments for measuring pre-training factors, factors affecting learning transfer, and other contextual factors influencing training effectiveness.
o Kraiger et al. (1995) used a method for assessing individual trainees' domain-specific knowledge and skills, whereas other researchers have provided tools for more accurately assessing the multidimensionality of participant reaction measures, along with models for thinking about multiple dimensions of job performance.

References
1. Alliger, G. M., & Janak, E. A. (1989). Kirkpatrick's levels of training criteria: Thirty years later. Personnel Psychology, 42(2).
2. Bates, R. (2004). A critical analysis of evaluation practice: The Kirkpatrick model and the principle of beneficence. Evaluation and Program Planning, 27(3).
3. Beauchamp, T. L., & Childress, J. F. (1983). Principles of biomedical ethics (2nd ed.). New York: Oxford University Press.
4. Goldstein, I. L., & Ford, J. K. (2002). Training in organizations. Belmont, CA: Wadsworth.
5. Kaufman, R., & Keller, J. M. (1994). Levels of evaluation: Beyond Kirkpatrick. Human Resource Development Quarterly, 5(4).