INTRODUCTION TO THE S.H.E.A. MODELS FOR TRAINING EVALUATION SMARTRISK LEARNING SERIES December 18th, 2007 By Dr. Michael P. Shea.

Presentation transcript:

INTRODUCTION TO THE S.H.E.A. MODELS FOR TRAINING EVALUATION SMARTRISK LEARNING SERIES December 18th, 2007 By Dr. Michael P. Shea

2 HISTORY OF TRAINING EVALUATION
- Kirkpatrick (1959) developed a four-level model of training evaluation
- Developed for the private sector
- Now the “industry standard”
- Used by SMARTRISK since 2003

3 Kirkpatrick’s Model and Its Four Levels
1) Reaction
2) Learning
3) Behavior
4) Results

4 The S.H.E.A. MODELS
Seven-Level Hierarchical Evaluation Assessment Models

5 Added Value of the S.H.E.A. (2007) Models
- Seven levels versus four
- Different models for different contexts (i.e., private sector, public/NFP sectors, CBT/online training)
- All of our models start with logic models and “implementation” evaluation

6 WHY CONTEXT MATTERS
- The private sector has more power to control implementation variation
- The private sector produces standardized training materials and trainers
- The public/NFP sector has much less budget (e.g., Bell Canada: $3,000 per staff)

7 WHY SEVEN LEVELS?
- Kirkpatrick did not address the issue of implementation
- Logic models were unknown in 1959, and Kirkpatrick never addressed them
- Results (Kirkpatrick’s Level Four) are much simpler to assess in a private-sector context

8 TODAY’S FOCUS
- Public and NFP sectors only
- Will describe how the seven levels could be used in the context of the CIPC
- A proposal to the CCCIPC will be submitted for this in early 2008

9 IMPLEMENTATION EVALUATION
- Cannot assume that implementation will be automatic
- Critical in multi-site training programs
- Need a logic model developed with staff (a minimal sketch follows this slide)
- See CIPCC example
- There is much more to training than just delivery
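The deck treats logic models as the starting point for every S.H.E.A. model but does not show one. The sketch below is a hypothetical, minimal way to record a logic model for a multi-site training program and turn it into implementation-evaluation questions; the inputs/activities/outputs/outcomes structure follows common logic-model convention, and every entry is illustrative rather than SMARTRISK's or the CIPC's actual model.

```python
# A minimal, hypothetical logic model for a multi-site training program.
# The structure (inputs -> activities -> outputs -> outcomes) follows the
# common logic-model convention; all entries are illustrative only.
logic_model = {
    "inputs": ["trainers", "curriculum", "site coordinators", "funding"],
    "activities": ["deliver workshops", "distribute materials", "coach staff"],
    "outputs": ["number of sessions delivered", "number of staff trained"],
    "short_term_outcomes": ["knowledge and skill gains"],
    "long_term_outcomes": ["application on the job", "benefits to client outcomes"],
}

def implementation_checklist(model: dict) -> list[str]:
    """Turn each planned activity and output into a question that an
    implementation evaluation could ask at every site."""
    questions = []
    for activity in model["activities"]:
        questions.append(f"Was '{activity}' carried out as planned at this site?")
    for output in model["outputs"]:
        questions.append(f"Was '{output}' recorded at this site?")
    return questions

for question in implementation_checklist(logic_model):
    print(question)
```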

10 IMMEDIATE FEEDBACK
- Should be done before the trainees leave
- Include some implementation evaluation
- Use Likert scales with 5 or 7 points
- Only ask about what you can change
- Use this data as soon as possible (a worked example follows this slide)
- Does not always require a questionnaire
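As one illustration of putting immediate-feedback data to use quickly, the sketch below tallies hypothetical 5-point Likert ratings for a few actionable items; the items and scores are invented, and the summary shown (mean plus percent favourable) is a common convention, not something the S.H.E.A. models prescribe.

```python
from statistics import mean

# Hypothetical immediate-feedback ratings on a 5-point Likert scale
# (1 = strongly disagree, 5 = strongly agree). Only items the trainer
# can actually act on are included, per the slide's advice.
responses = {
    "The pace of the session was appropriate": [4, 5, 3, 4, 2, 5, 4],
    "The examples were relevant to my work": [5, 4, 4, 3, 4, 5, 5],
    "I had enough opportunity to practise the skills": [3, 2, 4, 3, 3, 2, 4],
}

for item, scores in responses.items():
    pct_favourable = 100 * sum(s >= 4 for s in scores) / len(scores)
    print(f"{item}: mean {mean(scores):.1f}, {pct_favourable:.0f}% rating 4 or 5")
```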

11 KNOWLEDGE & SKILL GAINS
- A pre-test is usually needed (a minimal pre/post sketch follows this slide)
- One variation is the post-test-then-pre-test (retrospective pre-test) design
- Follow-up is critical
- Recommend collecting data at least once at three-month follow-up
- Sometimes longer follow-up is needed
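A minimal sketch of the pre-test/post-test comparison described above, using invented scores for ten trainees; the paired t-test is one common way to check whether the average gain exceeds chance and assumes SciPy is installed, but it is not mandated by the S.H.E.A. models.

```python
from statistics import mean
from scipy import stats  # assumes SciPy is available

# Hypothetical knowledge-test scores (percent correct) for the same
# ten trainees before training and at three-month follow-up.
pre = [52, 61, 48, 70, 55, 63, 58, 49, 66, 60]
post = [68, 72, 60, 78, 70, 71, 65, 62, 74, 69]

gains = [b - a for a, b in zip(pre, post)]
t_stat, p_value = stats.ttest_rel(post, pre)  # paired comparison

print(f"Mean gain: {mean(gains):.1f} points")
print(f"Paired t = {t_stat:.2f}, p = {p_value:.3f}")
```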

12 APPLICATION OF KNOWLEDGE & SKILLS
- Three-month minimum follow-up
- Collect data from trainees
- Online surveys can be very useful
- Can be done with the follow-up knowledge test
- “Usefulness” is key
- If not useful …

13 BENEFITS TO TRAINEES’ ORGANIZATIONS
- Some of this data can be collected at follow-up
- Six months or more is recommended
- Need to collect data from trainees’ supervisors at a minimum
- Use the 360-degree method if possible

14 CONTRIBUTION TO CLIENT OUTCOMES
- In our context we “contribute” in small ways to long-term outcomes
- This is the “attribution” question (Mayne)
- The logic model is critical here
- We can only contribute
- There are multiple sources of “explained” variance

15 COSTS & BENEFITS
- Not as important in the NFP or public sector
- However, we do need to document the “hidden costs” of training
- Need to quantify benefits whenever and wherever possible (a worked example follows this slide)
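To illustrate documenting hidden costs and quantifying benefits where possible, the sketch below lays out cost and benefit figures for one training cycle; every number and category is hypothetical, invented purely for illustration.

```python
# Hypothetical, illustrative cost and benefit figures for one training
# cycle in a public/NFP setting. "Hidden" costs such as participants'
# time away from regular duties are listed explicitly.
costs = {
    "trainer fees": 4_000,
    "materials and venue": 1_200,
    "participants' time off regular duties": 6_500,  # a commonly hidden cost
    "travel": 900,
}
quantifiable_benefits = {
    "reduced rework after training": 5_000,
    "fewer external consultant hours": 3_500,
}

total_cost = sum(costs.values())
total_benefit = sum(quantifiable_benefits.values())
print(f"Total cost: ${total_cost:,}")
print(f"Quantified benefit: ${total_benefit:,}")
# A negative net figure is common when only part of the benefit can be
# quantified; the remaining benefits still need to be described.
print(f"Net quantified benefit: ${total_benefit - total_cost:,}")
```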

16 SUMMARY
- A lot has changed since 1959
- The context in which training is done matters
- Implementation is not automatic
- Logic models are key
- Follow-up is essential
- Results are multi-faceted