1 © 2013 by Nelson Education Ltd. CHAPTER ELEVEN Training Evaluation
2 © 2013 by Nelson Education Ltd. LEARNING OUTCOMES
– Define training evaluation and the main reasons for conducting evaluations
– Discuss the barriers to evaluation and the factors that affect whether or not an evaluation is conducted
– Describe the different types of evaluations
– Describe the models of training evaluation and the relationship among them
3 © 2013 by Nelson Education Ltd. LEARNING OUTCOMES
– Describe the main variables to measure in a training evaluation and how they are measured
– Discuss the different types of designs for training evaluation as well as their requirements, limits, and when they should be used
4 © 2013 by Nelson Education Ltd. INSTRUCTIONAL SYSTEMS DESIGN MODEL
5 © 2013 by Nelson Education Ltd. INSTRUCTIONAL SYSTEMS DESIGN MODEL
Training evaluation is the third step of the ISD model and consists of two parts:
– The evaluation criteria (what is being measured)
– The evaluation design (how it will be measured)
These concepts are covered in the next two chapters. Each has a specific and important role to play in the effective evaluation of training and the completion of the ISD model.
6 © 2013 by Nelson Education Ltd. TRAINING EVALUATION Process to assess the value – the worthiness – of training programs to employees and to organizations
7 © 2013 by Nelson Education Ltd. TRAINING EVALUATION
– Not a single procedure; a continuum of techniques, methods, and measures
– Ranges from simple to elaborate procedures
– The more elaborate the procedure, the more complete the results, but usually the more costly (time, resources)
– Select the procedure based on what makes sense and what can add value within the resources available
8 © 2013 by Nelson Education Ltd. WHY A TRAINING EVALUATION?
– Improve managerial responsibility toward training
– Assist managers in identifying what, and who, should be trained
– Determine the cost–benefit of a program
– Determine if a training program has achieved the expected results
– Diagnose the strengths and weaknesses of a program and pinpoint needed improvements
– Justify and reinforce the value of training
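One purpose listed above, determining a program's cost–benefit, can be illustrated with a small calculation. A minimal Python sketch, where the dollar figures and the simple benefits-minus-costs formula are illustrative assumptions rather than a method from the text:

```python
def training_roi(benefits: float, costs: float) -> float:
    """Simple return on investment for a training program,
    expressed as a percentage of the program's costs."""
    return (benefits - costs) / costs * 100

# Hypothetical figures: $150,000 in estimated benefits
# (e.g., productivity gains) against $100,000 in program costs.
roi = training_roi(benefits=150_000, costs=100_000)
print(f"ROI: {roi:.1f}%")  # ROI: 50.0%
```

A program that merely recovers its costs yields 0% by this measure; the harder practical problem, as the chapter notes, is obtaining a credible benefits estimate in the first place.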
9 © 2013 by Nelson Education Ltd. DO WE EVALUATE?
There has been a steady decline in determining ROI and in conducting Level 3 and Level 4 evaluations
10 © 2013 by Nelson Education Ltd. BARRIERS TO EVALUATION
Barriers fall into two categories:
1. Pragmatic: Evaluation requires specialized knowledge and can be intimidating; data collection can be costly and time consuming
2. Political: Evaluation has the potential to reveal the ineffectiveness of training
11 © 2013 by Nelson Education Ltd. TYPES OF EVALUATION
Evaluations may be distinguished from each other with respect to:
1. The data gathered and analyzed
2. The fundamental purpose of the evaluation
12 © 2013 by Nelson Education Ltd. TYPES OF EVALUATION
1. The data gathered and analyzed
a. Trainee perceptions, learning, and behaviour at the conclusion of training
b. Psychological forces that operate during training
c. Information about the work environment (transfer climate and learning culture)
13 © 2013 by Nelson Education Ltd. TYPES OF EVALUATION
2. The purpose of the evaluation
a. Formative: Provides data about various aspects of a training program
b. Summative: Provides data about the worthiness or effectiveness of a training program
c. Descriptive: Provides information that describes trainees once they have completed a training program
d. Causal: Provides information to determine whether training caused the post-training behaviours
14 © 2013 by Nelson Education Ltd. MODELS OF EVALUATION
A. Kirkpatrick's Hierarchical Model
The oldest, best-known, and most frequently used model. The four levels of training evaluation:
– Level 1: Reactions
– Level 2: Learning
– Level 3: Behaviours
– Level 4: Results
A fifth level, ROI, is often added.
15 © 2013 by Nelson Education Ltd. CRITIQUE OF EVALUATION
There is general agreement that the five levels are important outcomes to be assessed. There are, however, some critiques of the model:
– Doubts about its validity
– Insufficiently diagnostic
– Requires all training evaluations to rely on the same variables and outcome measures
16 © 2013 by Nelson Education Ltd. MODELS OF EVALUATION
B. COMA Model
A training evaluation model that involves the measurement of four types of variables:
1. Cognitive
2. Organizational environment
3. Motivation
4. Attitudes
17 © 2013 by Nelson Education Ltd. MODELS OF EVALUATION
The COMA model improves on Kirkpatrick's model in four ways:
1. Transforms the typical reaction measure by incorporating a greater number of measures
2. Useful for formative evaluations
3. The measures are known to be causally related to training success
4. Defines new variables with greater precision
Note: COMA is a relatively new model; it is too early to draw conclusions about its value.
18 © 2013 by Nelson Education Ltd. MODELS OF EVALUATION
C. Decision-Based Evaluation (DBE) Model
A training evaluation model that specifies the target, focus, and methods of evaluation
19 © 2013 by Nelson Education Ltd. MODELS OF EVALUATION
The Decision-Based Evaluation model goes further than either of the two preceding models:
1. Identifies the target of the evaluation (trainee change, organizational payoff, program improvement)
2. Identifies its focus (the variables measured)
3. Suggests methods
4. Generalizes to any evaluation goal
5. Offers flexibility: the evaluation is guided by its target
20 © 2013 by Nelson Education Ltd. MODELS OF EVALUATION
As with COMA, the DBE model is recent and will need to be tested more fully. All three models require specialized knowledge to complete the evaluation, which can limit their use in organizations lacking that knowledge. Holton and colleagues' Learning Transfer System Inventory (seen in Chapter 10) provides a more generic approach; see Training Today 11.2 for more on its use for evaluation.
21 © 2013 by Nelson Education Ltd. EVALUATION VARIABLES
Training evaluation requires that data be collected on important aspects of training. Some of these variables have been identified in the three models of evaluation. A more complete list of variables is presented in Table 11.1, and Table 11.2 shows sample questions and formats for measuring each type of variable.
22 © 2013 by Nelson Education Ltd. EVALUATION VARIABLES
A. Reactions
B. Learning
C. Behaviour
D. Motivation
E. Self-efficacy
F. Perceived/anticipated support
G. Organizational perceptions
H. Organizational results
See Table 11.2 in the text
23 © 2013 by Nelson Education Ltd. VARIABLES
A. Reactions
1. Affective reactions: Measures that assess trainees' likes and dislikes of a training program
2. Utility reactions: Measures that assess the perceived usefulness of a training program
24 © 2013 by Nelson Education Ltd. VARIABLES
B. Learning
Learning outcomes can be measured by:
1. Declarative learning: The acquisition of facts and information; by far the most frequently assessed learning measure
2. Procedural learning: The organization of facts and information into a smooth behavioural sequence
25 © 2013 by Nelson Education Ltd. VARIABLES
C. Behaviour
Behaviours can be measured using three approaches:
1. Self-reports
2. Observations
3. Production indicators
26 © 2013 by Nelson Education Ltd. VARIABLES
D. Motivation
Two types of motivation in the training context:
1. Motivation to learn
2. Motivation to apply the skill on the job (transfer)
E. Self-efficacy
The beliefs trainees hold about their ability to perform the behaviours taught in a training program
27 © 2013 by Nelson Education Ltd. VARIABLES
F. Perceived and/or Anticipated Support
Two important measures of support:
1. Perceived support: The degree to which the trainee reports receiving support in attempts to transfer the learned skills
2. Anticipated support: The degree to which the trainee expects to be supported in attempts to transfer the learned skills
28 © 2013 by Nelson Education Ltd. VARIABLES
G. Organizational Perceptions
Two scales designed to measure perceptions:
1. Transfer climate: Can be assessed via a questionnaire that identifies eight sets of "cues"
2. Continuous learning culture: Can be assessed via the questionnaire presented in Trainer's Notebook 4.1 in Chapter 4 of the text
29 © 2013 by Nelson Education Ltd. VARIABLES
G. Organizational Perceptions (cont'd)
Transfer climate cues include:
– Goal cues
– Social cues
– Task and structural cues
– Positive feedback
– Negative feedback
– Punishment
– No feedback
– Self-control
30 © 2013 by Nelson Education Ltd. VARIABLES
H. Organizational Results
Results information includes:
1. Hard data: Results measured objectively (e.g., number of items sold)
2. Soft data: Results assessed through perceptions and judgments (e.g., attitudes)
3. Return on expectations: Measurement of a training program's ability to meet managerial expectations
31 © 2013 by Nelson Education Ltd. DESIGNS IN TRAINING EVALUATION
The manner in which data collection is organized and how the data will be analyzed. All data collection designs compare the trained person to something.
32 © 2013 by Nelson Education Ltd. DESIGNS IN TRAINING EVALUATION
1. Non-experimental designs: The comparison is made to a standard, not to another group of (untrained) people
2. Experimental designs: The trained group is compared to another group that does not receive the training; assignment is random
3. Quasi-experimental designs: The trained group is compared to another group that does not receive the training; assignment is not random
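The logic of the experimental and quasi-experimental designs, measuring a trained group and an untrained control group before and after training, can be sketched in a few lines of Python. The sample scores below are hypothetical, and the simple difference-of-gains calculation is an illustrative assumption, not a procedure prescribed by the text:

```python
def mean(values):
    return sum(values) / len(values)

def gain_over_control(trained_pre, trained_post, control_pre, control_post):
    """Mean improvement of the trained group minus the mean
    improvement of the untrained control group. A positive value
    suggests the training itself, rather than outside factors
    affecting everyone, produced the gain."""
    trained_gain = mean(trained_post) - mean(trained_pre)
    control_gain = mean(control_post) - mean(control_pre)
    return trained_gain - control_gain

# Hypothetical test scores (out of 100) before and after training
effect = gain_over_control(
    trained_pre=[50, 60, 55], trained_post=[70, 80, 75],
    control_pre=[50, 60, 55], control_post=[55, 65, 60],
)
print(effect)  # 15.0
```

Here both groups improve (perhaps from practice or workplace changes), but the trained group improves 15 points more, which is the part of the gain the design attributes to training.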
33 © 2013 by Nelson Education Ltd. DATA COLLECTION DESIGN
34 © 2013 by Nelson Education Ltd. DATA COLLECTION DESIGN
[Diagram: measurement points at pre and post]
A: Single-group post-only design (non-experimental)
B: Single-group pre-post design (non-experimental)
35 © 2013 by Nelson Education Ltd. DATA COLLECTION DESIGN
[Diagram: pre and post measurement points for trained and untrained groups]
C: Time series design (non-experimental)
D: Single-group design with control group
36 © 2013 by Nelson Education Ltd. DATA COLLECTION DESIGN
[Diagram: pre and post measurement points for trained and untrained groups]
E: Pre-post design with control group
F: Time series design with control group
37 © 2013 by Nelson Education Ltd. DATA COLLECTION DESIGN
[Diagram: pre and post measurement points]
G: Internal referencing strategy – training assessed on relevant items versus irrelevant items
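The internal referencing strategy compares pre-post gains on test items the training covered against gains on irrelevant items it did not cover, using the learner as their own control. A minimal Python sketch, where the item names and scores are hypothetical and the difference-of-gains calculation is an illustrative assumption:

```python
def mean(values):
    return sum(values) / len(values)

def irs_effect(pre, post, relevant, irrelevant):
    """Mean gain on training-relevant items minus mean gain on
    irrelevant items. A clearly larger gain on relevant items
    suggests the improvement came from the training rather than
    from retesting effects, which would lift both item sets."""
    relevant_gain = mean([post[i] - pre[i] for i in relevant])
    irrelevant_gain = mean([post[i] - pre[i] for i in irrelevant])
    return relevant_gain - irrelevant_gain

# Hypothetical item scores: q1/q2 were covered by the training, q3/q4 were not
pre  = {"q1": 2, "q2": 3, "q3": 4, "q4": 4}
post = {"q1": 5, "q2": 5, "q3": 5, "q4": 4}
print(irs_effect(pre, post, relevant=["q1", "q2"], irrelevant=["q3", "q4"]))  # 2.0
```

The appeal of this design is practical: it requires no separate untrained control group, only a test with both covered and uncovered items.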
38 © 2013 by Nelson Education Ltd. SUMMARY
– Discussed the main purposes for evaluating training programs as well as the barriers to evaluation
– Presented, critiqued, and contrasted three models of training evaluation (Kirkpatrick, COMA, and DBE)
– Recognized that the Kirkpatrick model is the most frequently used, yet has limitations
– Discussed the variables required for an evaluation as well as the methods and techniques required to measure them
– Presented the main types of data collection designs
– Discussed the factors influencing the choice of data collection design