Presentation on theme: "Part 2. Evaluation FORMATIVE SUMMATIVE Why Evaluate ? Makes good economic sense Allows for accountability Answers the increased scrutiny of Budgets Fulfills."— Presentation transcript:

1 Part 2

2 Evaluation FORMATIVE SUMMATIVE

3 Why Evaluate?
 Makes good economic sense
 Allows for accountability
 Answers the increased scrutiny of budgets
 Fulfills pressure to contribute

4 Why Evaluate?
 Considers peer pressure
 Provides self-satisfaction
 Relies on more information now available
 Supports professionalism
 Meets your need for survival

5 Results-Based Approach
1. Requires tangible results that can be measured
2. Includes at least one method of evaluation
3. Needs someone to have responsibility for evaluation; all PIT staff should be involved
4. Encourages management to be involved in the intervention
5. Supports a proactive increase in management commitment

6 Results-Based Approach
6. Needs a systematic measurement and evaluation plan in place
7. Needs participants to understand their role in achieving results
8. Works best when interventions are connected to strategic initiatives

7 Exercise # 1 Briefly identify what you would do as a PIT professional to ensure the results-based approach was utilized in any type of HRD intervention. (Handout 2)

8 Purposes and Uses of Evaluations
1. Asks: Are we accomplishing program objectives?
2. Identifies strengths and weaknesses of interventions (effectiveness and efficiency)
3. Compares costs to benefits
4. Helps answer who should participate in current and future interventions
5. Tests the clarity and validity of tests, cases, and exercises used in the intervention

9 Purposes and Uses of Evaluations
6. Identifies which participants were most successful with the intervention
7. Gathers data for marketing future programs
8. Determines: Was the intervention an appropriate solution for the identified performance gap?
9. Establishes a database for future decision making concerning interventions (becomes best practices)

10 Exercise # 2 List 2-3 reasons why you think these elements are necessary in the overall purpose/use of evaluation for HRD processes. (Handout 3)

11 Last Week  Why evaluate  Results-based approach  Purposes and uses of evaluation  Reviewed Kirkpatrick model

12 This Week  More on Kirkpatrick model  Types of data collection  Types of data  Evaluation instruments  Tips for surveys/questionnaires * Will vary from note-taking H.O.

13 Warm Up Exercise F

14 Levels of Evaluation
Donald Kirkpatrick's 4 Levels
I. Reaction – Were the participants pleased with the intervention?
II. Learning – What did the participants learn?
III. Behavior – Did the participants change their behavior based on what was learned?
IV. Results – Did the change in behavior positively affect the organization?

15 Exercise # 3 What are some advantages and limitations in each of the four levels of Kirkpatrick’s evaluation model? (Use note-taking handout)

16 Collect Post-Intervention Data (Handout 3)
1. Surveys
2. Questionnaires
3. On-the-job observation
4. Post-intervention interviews
5. Focus groups
6. Program assignments
7. Action plans
8. Performance contracts
9. Follow-up sessions
10. Performance monitoring

17 Exercise # 4 Using the 10 items in the Post-Intervention Data handout (3): each group describe (fabricate) a situation where you might gather post-intervention data for a Level 2/3/4 evaluation. Be prepared to explain your rationale.

18 Hard Data
Hard data can be grouped into four categories:
1. Output – of the work unit
2. Quality – how well goods are produced or services delivered
3. Cost – improvements in costs
4. Time – savings

19 Soft Data
 Work habits – absenteeism, tardiness, violations
 Climate – number of grievances, complaints, job satisfaction
 Satisfaction – favorable reactions, employee loyalty, increased confidence

20 Exercise # 5 List some advantages and disadvantages / limitations when collecting hard and soft data. (Use Notetaking H.O. p. 3/4)

21 Evaluation Instruments
Validity – does it measure what it is supposed to measure?
 Content validity – how well does the instrument measure the content/objectives of the program?
 Construct validity – how well does it measure the construct (an abstract variable such as KSA)?
 Concurrent validity – how well does the instrument measure up against other instruments?
 Predictive validity – how well can it predict future behavior?

22 Part 3

23 Last Week  More on Kirkpatrick model  Types of data collection  Types of data

24 This Week  Developing evaluation instruments  The survey process * New Handouts

25 Exercise # 5 List five things you would do to enhance the chances of getting a good number of returns for surveys/questionnaires.

26 Survey Process – Tips
 Communicate the purpose in advance
 Use a signed introductory letter
 Explain who will see the data
 Use anonymous input?

27 More Tips
 Keep it simple
 Simplify the response process
  – Bubble format
  – SASE
 Utilize local support

28 More Tips
 Consider incentives
 Use follow-up reminders
 Send a copy of the results to the participants

29 Action Planning
The most common type of follow-up assignment. Developed by participants. Contains detailed steps to accomplish measurable objectives. Shows what is to be done, by whom, and when. Must be monitored.

30 Action Plans
 Communicate the action plan requirement early and explain its value (avoids resistance)
 Describe the action-planning process at the beginning of the program (outline)
 Teach the action-planning process
 Allow time to develop the plan
 Have the facilitator approve the action plans
 Require participants to assign a monetary value for each improvement (helps ROI later)

31 Action Plans
 Ask participants to isolate the effects of the program
 Ask participants to provide a confidence level for their estimates
 Require action plans to be presented to the group by participants (peer review) if possible
 Explain the follow-up mechanism
 Collect the action plans
 Summarize the data and calculate ROI

32 Converting Data to Monetary Benefits
1. Focus on a unit of measure
2. Determine the value of each unit
3. Calculate the change in performance
4. Determine an annual amount for the change
5. Calculate the total value of the improvement
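The five conversion steps above can be sketched as a short calculation. All figures below (the unit of measure, dollar value, weekly change, and working weeks) are hypothetical assumptions for illustration, not numbers from the presentation:

```python
# Hypothetical walk-through of the five conversion steps.
unit_of_measure = "customer complaint resolved"    # 1. focus on a unit of measure
value_per_unit = 75.00                             # 2. value of each unit, in dollars (assumed)
change_per_week = 12                               # 3. measured change in performance (assumed)
weeks_per_year = 48                                # working weeks per year (assumed)

annual_change = change_per_week * weeks_per_year   # 4. annualize the change
total_value = annual_change * value_per_unit       # 5. total value of the improvement

print(f"Annual improvement: {annual_change} units of '{unit_of_measure}'")
print(f"Total annual value: ${total_value:,.2f}")
```

With these assumed figures, the annual change is 576 units, worth $43,200.00 per year; in practice the per-unit value would come from one of the sources on the next slide (historical costs, expert or participant estimates, and so on).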

33 Ways to Put Value on Units
 Cost of quality
 Converting employee time
 Using historical costs
 Using internal and external experts
 External databases
 Estimates from participants
 Estimates from supervisors
 Estimates from senior managers
 Using HRD staff estimates

34 Credibility
 Source of the data
 Source of the study
 Motives of the evaluators
 Methodology of the study
 Assumptions made in the analysis
 Realism of the outcome data
 Types of data
 Scope of analysis

35 Guidelines for Study
 Use credible and reliable sources for estimates
 Present material in an unbiased, objective way
 Fully explain methods (step by step)
 Define assumptions and compare to other studies
 Consider factoring or adjusting output values when they appear unrealistic
 Use hard data whenever possible

36 Identifying Intangible Measures (not based upon monetary values)
 Employee satisfaction
 Stress reduction
 Employee turnover
 Customer satisfaction, retention
 Team effectiveness

37 Determining Costs
 Collect costs on every intervention
 Costs will not be precise (hard to be perfect)
 Be practical – work with the accounting department
 Define which costs to collect, categories, sources
 Computerize
 Cost accumulation (track accounts)
 Cost estimation (formulas – page 227)
 Fully load with all costs possible – be truthful
 Overhead, benefits, peripheral costs, etc.

38 Data Analysis
 Statistics (use a professional)
 Use terms appropriately (e.g., significant difference)
 Statistical deception (leads to erroneous conclusions)

39 Return on Investment
 Compares costs to benefits
 Complicated
 Usually annualized
 Business-case specific
 Communicate the formula used

40 Phillips ROI Framework
I. Reaction and Planned Action – measures participants' reactions and plans to change
II. Learning – measures KSA
III. Job Applications – measures change of behavior on the job and specific use of the training material
IV. Business Results – measures the impact of the program
V. Return on Investment – measures the monetary value of the results and the costs of the program, usually expressed as a percentage

41 Evaluation as a Customer Satisfaction Tool
Level 1 – Reaction – Participants
Level 2 – Learning – Participants
Level 3 – Job Applications – Immediate managers
Level 4 – Business Impact – Immediate/senior managers
Level 5 – Return on Investment – Senior managers, executives

42 From Level 4 to Level 5
Requires three steps:
1. Level 4 data must be converted to monetary values
2. The cost of the intervention must be tabulated
3. Calculate the formula

43 ROI Process Model
Collect Data → Isolate Effects of Training → Convert Data to Monetary Value → Tabulate Program Costs → Calculate the Return on Investment
(identify intangible benefits alongside the monetary analysis)

44 ROI Formula
ROI (%) = (Net Program Benefits / Program Costs) × 100

45 Two Methods
1. Cost/Benefit Ratio – an early model that compares the intervention's costs to its benefits in ratio form: for every dollar invested in the intervention, X dollars in benefits were returned.
2. ROI Formula – net program benefits divided by program costs, expressed as a percentage.

46 Cost/Benefit Ratio
CBR = Program Benefits / Program Costs

47 ROI Formula
ROI (%) = (Net Program Benefits / Program Costs) × 100
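The two methods can be contrasted on the same numbers. The benefit and cost amounts below are assumed purely for illustration:

```python
# Compare the Cost/Benefit Ratio and the ROI formula on one set of
# hypothetical (assumed) program figures.
program_benefits = 240_000.0   # total monetary benefits (assumed)
program_costs = 80_000.0       # fully loaded program costs (assumed)

# Method 1: Cost/Benefit Ratio -- benefits divided by costs.
cbr = program_benefits / program_costs

# Method 2: ROI formula -- NET benefits (benefits minus costs)
# divided by costs, expressed as a percentage.
net_benefits = program_benefits - program_costs
roi_pct = (net_benefits / program_costs) * 100

print(f"CBR: {cbr:.1f} : 1")   # 3.0 : 1 -- three dollars returned per dollar invested
print(f"ROI: {roi_pct:.0f}%")  # 200%
```

Note the difference: the same program shows a 3:1 benefit ratio but a 200% ROI, because ROI subtracts the costs before dividing. Communicating which formula was used (as the slide above advises) avoids this ambiguity.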

48 Cautions With Using ROI
 Make sure a needs assessment has been completed
 Include one or more strategies for isolating the effects of training
 Use reliable, credible sources in making estimates
 Be conservative when developing benefits and costs
 Use caution when comparing the ROI of training and development with other financial returns
 Involve management in developing the return
 Approach sensitive and controversial issues carefully
 Do not boast about a high return (internal politics)

49 Implementation Issues
 Identify an internal champion (cheerleader)
 Develop an implementation leader
 Assign responsibilities so everyone knows their assigned tasks and outcomes
 Set targets (annual)
 Develop a project plan and timetable
 Revise/develop policies and procedures (page 367)
 Assess the climate – gap analysis, SWOT, barriers

50 Preparing Your Staff
 Involve the staff in the process
 Use evaluation data as a learning tool
 Identify and remove obstacles (complexity, time, motivation, correct use of results)

51 ROI Administration
Which programs to select?
 Large target audiences
 Important to corporate strategies
 Expensive
 High visibility
 Comprehensive needs assessment

52 ROI Administration – Reporting Progress
 Status meetings (facilitated by an expert)
 Report progress
 Add evaluation areas
 Establish discussion groups
 Train the management team

53 Timing of Evaluation
1. During the program
2. Time series – multiple measures
3. Post-tests – timing

54 Questionnaire Content Issues
 Progress with objectives
 Action plan status
 Relevance of intervention
 Use of program materials
 Knowledge/skill application
 Skill frequency
 Changes in the work unit
 Measurable improvements/accomplishments
 Monetary impact
 Confidence level
 Improvement linked with the intervention
 Investment perception
 Linkage with output measures
 Barriers
 Enablers
 Management support
 Other solutions
 Target audience recommendations
 Suggestions for improvement

