1 Evaluation and Case Study Review Dr. Lam TECM 5180

2 Summative Evaluation vs. Formative Evaluation
Summative evaluation: assessment OF learning.
Formative evaluation: assessment FOR learning.

3 Summative or Formative?
1. You have been asked to determine how much money your training program has saved over a six-month period.
2. You asked trainees to complete a 50-question paper-and-pencil exam covering the learning objectives of your course.
3. You have been asked to observe trainees doing their jobs and write a report that describes their level of knowledge transfer.
4. You asked trainees for feedback about the content and delivery of the course and the facilitator.
5. After six months, you have emailed managers and asked them about each trainee's performance.

4 Types of Evaluation
There are many models of evaluation:
- Kirkpatrick's four levels of evaluation
- Stufflebeam's four-step evaluation process
- Rossi's five-domain evaluation model
- Brinkerhoff's success case method
We'll talk about Kirkpatrick (1994) because it's widely accepted and easy to grasp.

5 [Figure: Kirkpatrick's four levels of evaluation - Reactions, Learning, Behavior, Results]

6 Reactions
What?: Perceptions of the trainees
How?: Questionnaires and feedback forms
Why?: Gives designers insight into training satisfaction, which may be positive or negative.
Trainee feedback is relatively quick and easy to obtain, and it is not typically very expensive to analyze.

7 Evaluation instrument examples See Piskurich pages 274-275

8 Questionnaires
Open-ended items allow users to express opinions in their own words.
Advantages: users can give unique, open, and honest feedback.
Disadvantages: difficult to analyze; trainees often prefer not to fill them out (biased results).
Close-ended items allow users to express opinions on a predetermined quantitative scale.
Advantages: easy to analyze; fast completion for trainees.
Disadvantages: inhibits unique feedback; doesn't always provide a full picture.
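
To illustrate why close-ended items are "easy to analyze," here is a minimal Python sketch. The item text and response values are invented for illustration (they are not from Piskurich or the course); it simply summarizes 5-point Likert responses per item.

    # Minimal sketch: summarizing close-ended (Likert) responses.
    # Item text and response values below are invented for illustration.
    from statistics import mean, median
    from collections import Counter

    responses = {
        "The course content was relevant to my job.": [5, 4, 4, 3, 5, 4, 2, 5],
        "The facilitator explained concepts clearly.": [4, 4, 5, 5, 3, 4, 4, 5],
    }

    for item, scores in responses.items():
        counts = Counter(scores)
        print(item)
        print(f"  n={len(scores)}  mean={mean(scores):.2f}  median={median(scores)}")
        # Distribution across the 5-point scale (1 = strongly disagree, 5 = strongly agree)
        print("  distribution:", {point: counts.get(point, 0) for point in range(1, 6)})

Open-ended responses, by contrast, would require reading and coding each comment by hand, which is why the slide recommends using them sparingly.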

9 Creating close-ended questions
Use a scale that allows for degrees of comparison.
Not good: Did you find the course beneficial? Yes or No
Better: On a scale from 1 to 5, how beneficial did you find the course?
Always use the same scale (e.g., a 5-point or 7-point Likert scale).
Construct questions that are grammatically consistent.
Develop questions for specific purposes (i.e., don't ask questions if you don't know what you'll do with the result).

10 Creating open-ended questions
Limit your use of these: use them to supplement close-ended responses, and reserve them for unique feedback.
Bad use of open-ended: What did you like about the presentation slides?
Improved: On a scale from 1 to 5, how useful were the slides in supplementing the facilitator's content?
Bad use of close-ended: Rate the following on a scale of 1 to 5, with 1 being strongly disagree and 5 being strongly agree: I would make changes to the delivery of this course.
Improved: What changes would you make to the delivery of the course?

11 Learning
What?: Measure of the increase in knowledge before and after training
How?: Formal assessment; interview or observation
Why?: To ensure your trainees have learned what you set out for them to learn
These assessments should already exist if you've designed and developed your course properly (see last week's presentation slides for an assessment overview).
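
As a rough illustration of measuring an increase in knowledge before and after training, the sketch below compares pre-test and post-test scores. The scores, the 100-point scale, and the normalized-gain calculation are assumptions made for this example, not a method prescribed by Kirkpatrick or Piskurich.

    # Illustrative pre-test / post-test comparison for the Learning level.
    # Scores are invented; assumes both tests are scored out of 100.
    trainees = {
        "Trainee A": {"pre": 55, "post": 85},
        "Trainee B": {"pre": 70, "post": 90},
        "Trainee C": {"pre": 40, "post": 75},
    }

    for name, s in trainees.items():
        raw_gain = s["post"] - s["pre"]
        # Normalized gain: share of the available improvement actually achieved.
        normalized = raw_gain / (100 - s["pre"]) if s["pre"] < 100 else 0.0
        print(f"{name}: pre={s['pre']} post={s['post']} "
              f"gain={raw_gain} normalized={normalized:.0%}")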

12 Behavior
What?: The extent to which learning is applied back on the job
How?: Observation and interviews over time; retesting
Why?: To measure the long-term efficacy of your training program
Measuring behavior is difficult and requires the cooperation of management and others who oversee trainees' day-to-day work.

13 Piskurich's "Transfer to the job" Evaluation
1. Did the training address the requirements of the job?
2. Were the trainees performing the job requirements competently before the training?
3. Are the trainees now performing the job requirements competently?
4. What are the trainees still not doing correctly?
5. Were there any unintended consequences of the training?

14 Examples See Piskurich pages 278-279

15 Results
What?: The effect of training on the trainee's business or environment
How?: Measured with already-implemented systems; ROI; cost-effectiveness analysis
Why?: To measure the impact training has on the organization (at a macro level)
It is difficult to isolate training as a variable.

16 ROI (Return on Investment)
Use ROI to: demonstrate effectiveness; promote importance; suggest refinements; project future costs; measure success.
Drawbacks of ROI: it can't compute intangible benefits; it can't measure all variables and data; it can be misleading.

17 How to calculate ROI
ROI = Net Benefits / Cost
Net benefits can include: increased productivity; greater customer satisfaction; higher-quality work product.
Costs can include: design and development costs; ongoing costs; evaluation costs.
The hard part is quantifying benefits and costs. Although ROI is a quantifiable metric, there is an interpretive element in calculating it, so the logic and rationale behind the metric are as important as the metric itself.
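
To make the formula concrete, here is a small worked sketch. Only ROI = Net Benefits / Cost comes from the slide; the dollar figures are invented, and the sketch assumes the common convention that net benefits are total quantified benefits minus total costs.

    # Worked example of the slide's formula: ROI = Net Benefits / Cost.
    # All dollar figures are invented for illustration.
    benefits = {
        "increased productivity": 60_000,
        "higher-quality work product": 25_000,
        "greater customer satisfaction (estimated)": 15_000,
    }
    costs = {
        "design and development": 35_000,
        "ongoing delivery": 20_000,
        "evaluation": 5_000,
    }

    total_benefits = sum(benefits.values())
    total_costs = sum(costs.values())
    net_benefits = total_benefits - total_costs  # assumed reading of "net benefits"
    roi = net_benefits / total_costs             # often reported as a percentage

    print(f"Total benefits: ${total_benefits:,}")
    print(f"Total costs:    ${total_costs:,}")
    print(f"Net benefits:   ${net_benefits:,}")
    print(f"ROI: {roi:.2f} (i.e., {roi:.0%} return on training costs)")

Note how the "interpretive element" shows up here: the estimate assigned to customer satisfaction is a judgment call, which is why the rationale behind the numbers matters as much as the final figure.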

18 How to determine what evaluations to conduct
Why do I want to evaluate?
What am I going to evaluate?
Who should I involve as part of the evaluation?
How am I going to do the evaluation?
When should I do the evaluation?

19 Implementing Revisions
As-needed revisions: the most common type of revision; reactionary.
Planned revisions: a less common type of revision; proactive.

