Evaluation and Case Study Review
Dr. Lam, TECM 5180


Summative Evaluation vs. Formative Evaluation
Summative evaluation - assessment OF learning
Formative evaluation - assessment FOR learning

Summative or Formative?
You have been asked to determine how much money your training program has saved over a six-month period.
You asked trainees to complete a 50-question paper-and-pencil exam covering the learning objectives of your course.
You have been asked to observe trainees doing their jobs and write a report that describes their level of knowledge transfer.
You asked trainees for feedback about the content and delivery of the course and the facilitator.
After six months, you have e-mailed managers and asked them about each trainee's performance.

Types of Evaluation
There are many models of evaluation:
Kirkpatrick's four levels of evaluation
Stufflebeam's four-step evaluation process
Rossi's five-domain evaluation model
Brinkerhoff's success case method
We'll focus on Kirkpatrick (1994) because it's widely accepted and easy to grasp.

Reactions
What?: Perceptions of the trainees
How?: Questionnaires and feedback forms
Why?: Gives designers insight into training satisfaction, which may be positive or negative. Trainee feedback is relatively quick and easy to obtain, and it is not typically expensive to analyze.

Evaluation instrument examples See Piskurich pages

Questionnaires
Open-ended items - allow users to express opinions in their own words
Advantages: allow users to give unique, open, and honest feedback
Disadvantages: difficult to analyze; trainees often prefer not to fill them out (biased results)
Closed-ended items - allow users to express opinions on a predetermined quantitative scale
Advantages: easy to analyze; fast completion for trainees
Disadvantages: inhibit unique feedback; don't always provide a full picture

Creating closed-ended questions
Use a scale that allows for degrees of comparison
Not good: Did you find the course beneficial? Yes or No
Better: On a scale from 1 to 5, how beneficial did you find the course?
Always use the same scale (e.g., a 5-point or 7-point Likert scale)
Construct questions that are grammatically consistent
Develop questions for specific purposes (i.e., don't ask questions if you don't know what you'll do with the results)
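Because closed-ended items share one consistent scale, responses are simple to tally and compare across questions. A minimal Python sketch of that tallying step; the item wording and ratings are hypothetical, not taken from the course materials:

# Minimal sketch (hypothetical items and ratings): summarizing responses
# to closed-ended items that all use the same 5-point scale.
from statistics import mean

# Each key is a questionnaire item; each list holds the ratings (1-5) given by trainees.
responses = {
    "The course content was relevant to my job.": [4, 5, 3, 4, 4],
    "The facilitator explained concepts clearly.": [5, 4, 4, 5, 3],
    "The pace of the course was appropriate.": [3, 3, 4, 2, 4],
}

# Because every item uses the same 5-point scale, the averages are directly comparable.
for item, ratings in responses.items():
    print(f"{mean(ratings):.1f}  {item}")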

Creating open-ended questions
Limit your use of these
Use them to supplement closed-ended responses
Reserve them for unique responses
Bad use of open-ended: What did you like about the presentation slides?
Improved: On a scale from 1 to 5, how useful were the slides in supplementing the facilitator's content?
Bad use of closed-ended: Rate the following on a scale of 1 to 5, with 1 being strongly disagree and 5 being strongly agree: I would make changes to the delivery of this course.
Improved: What changes would you make to the delivery of the course?

Learning
What?: Measure of the increase in knowledge before and after training
How?: Formal assessment; interview or observation
Why?: To ensure your trainees have learned what you set out for them to learn
Already created if you've designed and developed your course properly (see last week's presentation slides for an assessment overview)
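One common way to operationalize this level is to give the same assessment before and after training and compare the scores. A minimal sketch, using hypothetical trainees and scores:

# Minimal sketch (hypothetical trainees and scores): comparing pre- and
# post-training assessment results to estimate how much was learned.
pre_scores = {"Trainee A": 62, "Trainee B": 71, "Trainee C": 55}
post_scores = {"Trainee A": 84, "Trainee B": 90, "Trainee C": 79}

for trainee, pre in pre_scores.items():
    post = post_scores[trainee]
    print(f"{trainee}: {pre} -> {post} (gain of {post - pre} points)")

# One rough group-level measure of learning is the average gain.
average_gain = sum(post_scores[t] - pre_scores[t] for t in pre_scores) / len(pre_scores)
print(f"Average gain: {average_gain:.1f} points")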

Behavior
What?: The extent of applied learning back on the job
How?: Observation and interviews over time; retesting
Why?: To measure the long-term efficacy of your training program
Measuring behavior is difficult and requires the cooperation of management and others who oversee a trainee's day-to-day work

Piskurich's "Transfer to the Job" Evaluation
1. Did the training address the requirements of the job?
2. Were the trainees performing the job requirements competently before the training?
3. Are the trainees now performing the job requirements competently?
4. What are the trainees still not doing correctly?
5. Were there any unintended consequences of the training?

Examples See Piskurich pages

Results
What?: The effect on the business or environment of the trainee
How?: Measured with already-implemented systems; ROI; cost-effectiveness analysis
Why?: To measure the impact training has on the organization (at a macro level)
Difficult to isolate training as a variable

ROI
Return on investment
Use ROI to: demonstrate effectiveness; promote importance; suggest refinements; project future costs; measure success
Drawbacks to ROI: can't compute intangible benefits; can't measure all variables and data; can be misleading

How to calculate ROI
ROI = Net Benefits / Costs
Net benefits can include: increased productivity; greater customer satisfaction; higher-quality work product
Costs can include: design and development costs; ongoing costs; evaluation costs
The hard part is quantifying benefits and costs.
Although ROI is a quantifiable metric, there is an interpretive element in calculating it. Therefore, the logic and rationale behind the metric are as important as the metric itself.
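As a quick worked example, here is a minimal Python sketch of the formula above. All dollar figures are hypothetical, and net benefits are assumed to mean total quantified benefits minus total costs:

# Minimal sketch (hypothetical figures) of ROI = Net Benefits / Costs.
benefits = {
    "increased productivity": 40_000,
    "greater customer satisfaction": 10_000,
    "higher-quality work product": 15_000,
}
costs = {
    "design and development": 20_000,
    "ongoing delivery": 12_000,
    "evaluation": 3_000,
}

# Assumption: net benefits = total quantified benefits - total costs.
net_benefits = sum(benefits.values()) - sum(costs.values())
roi = net_benefits / sum(costs.values())
print(f"Net benefits: ${net_benefits:,}")
print(f"ROI: {roi:.2f} ({roi:.0%} return on every training dollar spent)")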

How to determine what evaluations to conduct
Why do I want to evaluate?
What am I going to evaluate?
Who should I involve as part of the evaluation?
How am I going to do the evaluation?
When should I do the evaluation?

Implementing Revisions
As-needed revisions - most common type of revision; reactionary
Planned revisions - less common type of revision; proactive