1
Building Capability and Expertise with ROI Implementations
ROI Certification Building Capability and Expertise with ROI Implementations Jack J. Phillips, Ph. D. Patti P. Phillips, Ph. D.
2
Reaction Objectives
Provide participants with knowledge and skills that are:
- Relevant to their jobs
- Important to their current job success
- Immediately applicable
- New to their understanding of accountability
- Relevant to their colleagues in similar job situations
3
Learning Objectives Enable participants to:
- Describe the five critical components of a successful evaluation practice
- Describe the five levels of evaluation
- Describe the six types of data in the chain of impact
- Describe the ten steps in the ROI Methodology, and . . .
4
Learning Objectives
- Follow the 12 guiding principles
- Plan and execute an ROI evaluation project
- Calculate and explain the difference between the benefit-cost ratio (BCR) and the return on investment (ROI)
- Communicate the results of an ROI study to a variety of stakeholders
- Implement the ROI Methodology within their organization
5
Application Objectives
Support participants as they:
- Build support for the ROI Methodology in their organization
- Complete their initial ROI evaluation project
- Plan and implement future ROI projects
- Revise/update internal evaluation strategy/practice
- Brief/teach others in the ROI Methodology
- Change the way they propose, implement, and evaluate programs, processes, and initiatives
6
Impact Objectives
Enable participants to realize positive consequences as a result of applying what they learn, such as:
- Improving program effectiveness
- Improving program efficiencies
- Expanding successful programs
- Redesigning or discontinuing ineffective programs
- Improving relationships with clients and executives
- Enhancing the influence of their function within the organization
7
Setting the Stage
The ROI Methodology:
- Planning Evaluation
- Collecting Data
- Isolating the Effects of the Program
- Converting Data to Money
- Tabulating Costs and Calculating ROI
- Reporting Results
- Forecasting ROI
Implementing ROI
8
Program Success will be Measured by:
- Ratings achieved on end-of-course evaluation
- Increase in knowledge gain as reported on end-of-course evaluation
- Demonstration of knowledge through: course exercises, case study presentations, ROI project plan presentation, ROI implementation plan, and . . .
9
Program Success will be Measured by:
- ROI project completion following ROI Methodology steps and guiding principles:
  - Evaluation planning
  - Data collection
  - Data analysis
  - Report submittal
- Steps toward implementation (beyond the ROI project) completed as planned
10
Roles of the ROI Implementation Leader
Technical Expert Consultant Problem Solver Initiator Designer Developer Coordinator Cheerleader Communicator Process Monitor Planner Analyst Interpreter Teacher
11
Skill Areas for Certification
- Planning for ROI calculations
- Collecting evaluation data
- Isolating the effects of solutions
- Converting data to monetary values
- Monitoring program costs
- Analyzing data, including calculating the ROI
- Presenting evaluation data
- Implementing the ROI process
- Providing internal consulting on ROI
- Teaching others the ROI process
12
Certification Projects
Item — Due Date:
- Case Study Presentation — During Workshop
- Implementation Plan for the ROI Process — End of Workshop
- ROI Project Plan Implementation complete — 3-6 Months
- ROI project complete — 6 Months, Ideally
13
Case Study Presentation
- Team-based assignment
- Present results to an executive audience (you own the study)
- Q&A session
- Critique the case study (you don't own the study)
14
ROI Project
- Based on a planned or anticipated ROI impact study
- Individual or team based
- Provide a copy of the Data Collection Plan during workshop
- Provide a copy of the ROI Analysis Plan during workshop
- Ask for input from the group
- Ideally, complete within 6 months
15
Implementation Plan Requirements
- Specific
- Motivational
- Achievable
- Realistic
- Time-based
Must be within your control!
16
Global Communications
17
Paradigm Shift in Programs
Activity-Based, characterized by:
- no business need for the program
- no assessment of performance issues
- no specific measurable objectives
- no effort to prepare program participants to achieve results
Results-Based, characterized by:
- program linked to specific business needs
- assessment of performance effectiveness
- specific objectives for application and business impact
- results expectations communicated to participants
18
Paradigm Shift
Activity-Based, characterized by:
- no effort to prepare the work environment to support transfer
- no efforts to build partnerships with key managers
- no measurement of results or cost-benefit analysis
- reporting on programs is input focused
Results-Based, characterized by:
- environment prepared to support transfer
- partnerships established with key managers and clients
- measurement of results and cost-benefit analysis (ROI)
- reporting on programs is output focused
19
Definition of Results-Based Programs
Programs are initiated, developed, and delivered with the end in mind. A comprehensive measurement and evaluation system is in place for each program. Impact and ROI evaluations are regularly developed. Program participants understand their responsibility to obtain results with programs. Support groups help to achieve results from training.
20
How Results-Based Are Your Programs?
Take the assessment entitled “How Results-Based Are Your Programs?” When taking this assessment, try to be candid in selecting the appropriate response. Score your assessment using the guidelines provided. Compare your scores with others. What is considered to be an adequate score? What are the potential uses of this survey?
21
Human Capital Perspectives
Traditional View → Emerging View:
- Expenses are considered costs → Expenditures are viewed as a source of value
- Function is perceived as a support staff → Function is perceived as a strategic partner
- Involved in setting HR budget → Top executives involved in budget
- Metrics focus on cost and activities → Metrics focus on results
- Metrics created and maintained by HR alone → Top executives involved in metrics design and use
. . . and
22
Human Capital Perspectives
Traditional View → Emerging View:
- Little effort to understand the ROI in HC → ROI has become an important tool
- Measurement focuses on the data at hand → Measurement focuses on the data needed
- Measurement is based on what others measure → Measurement is based on organization needs
- Programs initiated without a business need → Programs linked to specific business needs
- Reporting is input-focused → Reporting is output-focused
23
Increased Interest in the Value of Human Capital
DRIVERS:
- The increasing cost of human capital
- Consequences of improper or ineffective HR practices
- Linkage of human capital to strategic initiatives
- Increased accountability of all functions
- Top executive requirement for HR contribution and human capital ROI
24
[Graphic: approaches to demonstrating value — Balanced Scorecard, Strategic Accountability, Evaluation, Value Based, Performance Standards, Effectiveness, Vital Signs, Economic Value Added, Shareholder Value, Benefits vs. Costs, Program Impact, Bottom-Line Contribution, Profitability, ROI]
25
Three Journeys
- The need to change the HR measurement mix
- Setting the investment level for human capital
- Valuing human capital
Each is explored next…
26
Apex, Inc.
27
Measuring the HR Contribution: Status
Comparison of Approaches to Measure the HR Contribution
28
HR Accountability Progress
[Timeline graphic, 1960s-2000, of HR accountability approaches, from early to leading edge:]
- Early approaches: MBO in Personnel, Feedback Surveys, Early HR Case Studies, HR Auditing
- Solid value-added approaches: HR Key Indicators, HR Cost Monitoring, HR Satisfaction Surveys, Competitive HR Benchmarking
- Leading edge approaches: Human Capital Measurement, HR Macro Studies, Balanced Scorecard, HR Profit Center, ROI Methodology
29
Leading Edge Approaches to Measuring the HR Contribution
- Balanced Scorecard
- HR Profit Center
- Human Capital Measures
- HR Macro Studies
- ROI Process — most promise as an immediate tool
30
Recommendations for Measurement Categories
Select an approach in each of these categories:
- Attitudinal Data
- Comparative Data
- Human Capital Measures
- Benefit/Cost Analysis (ROI)
Notes:
31
Common Human Capital Measures
- Innovation and Creativity
- Employee Attitudes
- Workforce Stability
- Employee Capability
- Human Capital Investment
- Leadership
- Productivity
- Workforce Profile
- Job Creation and Recruitment
- Compensation and Benefits
- Compliance and Safety
- Employee Relations
32
Setting the Investment Level
1 Let Others Do It!
33
Motivating Forces:
- Cost control
- Lack of infrastructure
- Instability
- Access to expertise
- Short-term focus
- Survival
Approaches:
- Hire fully competent employees
- Use contract employees
- Outsource major functions
34
Setting the Investment Level
2 Invest the Minimum!
35
Motivating Forces:
- Low-cost industry
- High labor use
- Strong competition
- Employees are dispensable
Approaches:
- Pay minimum wages
- Provide few benefits
- Keep training simple
- Expect turnover and address it
36
Human Resources Development Issues
Training
- Focus: Job-related skills
- Costs per Employee: Low
- Time for Payback: Short
- Risk for Payback: Low
37
Human Resources Development Issues
Education
- Focus: Preparation for the next job
- Costs per Employee: Moderate
- Time for Payback: Medium
- Risk for Payback: Moderate
38
Human Resources Development Issues
- Focus: Cultural change and continuous learning
- Costs per Employee: High
- Time for Payback: Long
- Risk for Payback: High
39
Setting the Investment Level
3 Invest with the Rest!
40
Motivating Forces:
- Desire to have best practices
- Benchmarking is acceptable
- Benchmarking is used in all parts of the organization
- Benchmarking can be low cost
- Benchmarking is low risk
Approaches:
- Locate existing reports
- Participate in existing projects
- Create a custom project
- Search the literature
41
Human Capital Investment Benchmarks
- Human Resource Expenses (HR department costs/budget)
- Total Investment in Human Capital (total HR expenses plus all salaries and benefits of non-HR staff)
- HR Expenses by Function
- HR Expenses by Process/Programming
- Selected HR Costs
42
Phases of the Benchmarking Process
1. Determining What to Benchmark
2. Building the Benchmarking Team
3. Identifying Benchmark Partners
4. Collecting Benchmarking Data
5. Analyzing the Data
6. Distributing Information to Benchmarking Partners
7. Initiating Improvement from Benchmarking
43
Setting the Investment Level
4 Invest Until it Hurts!
44
Motivating Forces:
- Fad chasing
- Happy-employee dilemma
- Quick fixes
- Retention concerns
- Competitive strategy
- Union demands
- We can afford it!
Approaches:
- Pay above-market wages
- Provide above-market employee benefits
- Implement most new fads/programs
- Provide all types of employee services
45
The Relationship Between Over-Investing and Performance
[Curve graphic: performance rises from under-investing to an optimal investment in human capital, then declines with over-investing]
46
Setting the Investment Level
5 Invest as Long as There is a Payoff!
47
Motivating Forces:
- Need to show HR contribution
- Increasing cost of human capital
- Secure funding
- Business partner role
- Improve processes
Approaches:
- Measure success of each HR program
- Collect up to six types of data
- Use ROI routinely
- Involve stakeholders
- Use the data
48
The ROI Methodology: Evaluation Planning and Data Collection
- Develop objectives of solution(s)
- Develop evaluation plans and baseline data
- Collect data during solution implementation (Level 1: Reaction and Planned Actions; Level 2: Learning and Confidence)
- Collect data after implementation (Level 3: Application and Implementation; Level 4: Business Impact)
49
Data Analysis and Reporting
- Isolate the effects
- Convert data to monetary value
- Tabulate costs of solution
- Calculate the return on investment (Level 5: ROI)
- Identify intangible measures (Intangible Benefits)
- Generate impact study
50
Methodical Development
- Training and Learning
- Organization Development
- HR Programs
- Change Initiatives
- Technology Implementation
- Quality / Six Sigma
- Meetings and Events
- Coaching
51
Valuing Human Capital: Three Approaches
What we know from Logic and Intuition What we know from Macro Level Research What we know from ROI Analysis
52
1. Logic and Intuition
- Automation has limitations
- People are necessary
- Stock market mystery
- Accounting dilemma
- Last source of competitive advantage
- Superstar phenomenon
53
Superstar Characteristics
People are the difference Good and great Great places to work Most admired companies
54
2. Macro Level Research
- HR Effectiveness Index
- Gallup Studies
- The Service Profit Chain
- Watson-Wyatt Studies
- Deloitte & Touche Studies
- and many others
55
3. ROI Analysis — Micro Analysis Tool
- 5,000 studies per year
- Over 40 countries / 25 languages
- Variety of applications
- ROI Certification
- ROI Networks
- ROI Standards
- ROI Best Practices
56
Valuing Human Capital The Complete Picture
Micro Analysis (ROI Studies) Macro Analysis (Relationships) Logic & Intuition (Intangibles)
57
Reliance Insurance Company
58
Matching Evaluation Levels with Objectives
Level 1: Reaction Level 2: Learning Level 3: Application Level 4: Business Impact Level 5: Return on Investment
59
Measurement in Learning and HR
Level 0 — Inputs/Indicators: measures the number of programs, participants, audience, costs, and efficiencies. Current coverage: 100%. Status: this is being accomplished now.
Level 1 — Reaction and Planned Action: measures reaction to, and satisfaction with, the experience, contents, and value of the program. Status: need more focus on content and perceived value.
60
Measurement in Learning and HR
Level 2 — Learning: measures what participants learned in the program (information, knowledge, skills, and contacts — take-aways from the program). Current coverage: 30-40%; goal: 80-90%. Status: must use simple learning measures.
Level 3 — Application: measures progress after the program — the use of information, knowledge, skills, and contacts. Current coverage: 10%; goal: 30%. Status: need more follow-up.
61
Measurement in Learning and HR
Level 4 — Business Impact: measures changes in business impact variables such as output, quality, time, and costs linked to the program. Current coverage: 5%; goal: 10%. Status: this is the connection to business impact.
Level 5 — ROI: compares the monetary benefits of the business impact measures to the costs of the program. Current coverage: 1%. Status: the ultimate level of evaluation.
62
The Results Reacted very positively to the program and found it to be very relevant to their work; Learned new skills and gained new insights about themselves; Utilized the skills and insights routinely with their teams, although they had some difficulty in a few areas; Improved several important work unit measures, with some measures improving as much as 28%; Achieved an impressive 105% return on investment; and Reported an increase in job satisfaction in the work unit.
63
Key Issues with This Level of Analysis
Objectives? Credibility of data? Source of data? Consistent methodology? Scope? Standards? Use of data? Cost of process? Fear of data?
64
The ROI Process generates six types of data:
1. Reaction to a project or program
2. Learning skills/knowledge
3. Application/implementation progress
4. Business impact related to the project or program
5. Return on investment
6. Intangible benefits
…and includes a technique to isolate the effects of the program.
65
ROI by the Numbers
- Process refined over a 25-year period
- 5,000 impact studies conducted each year
- 100 case studies published on ROI
- 3,000 individuals certified to implement the ROI Methodology
- 15 ROI books developed to support the process
- 600-member professional network formed to share information
- ROI Methodology adopted by over 2,000 organizations in manufacturing, service, non-profit, and government settings in over 40 countries
66
ROI Dilemma
- 70-80% of organizations want to use ROI (high on the wish list)
- 15-20% of organizations are currently using ROI (low on the use list)
Why the gap?
67
Why Use Impact and ROI Analysis?
Reactive:
- Show contributions of selected programs
- Justify/defend budgets
- Identify inefficient programs that need to be redesigned or eliminated
68
Why Use Impact and ROI Analysis?
Proactive:
- Align programs to business needs
- Earn respect of senior management/administrators
- Improve support for programs
- Enhance design and implementation processes
- Identify successful programs that can be implemented in other areas
69
Applications
- Learning and Development
- Career Development
- Competency Systems
- Diversity Programs
- E-Learning
- Executive Coaching
- Gainsharing
- Meetings and Events
- Leadership Development
- Organization Development
- Orientation Systems
- Recruiting Strategies
- Safety & Health Programs
- Self-Directed Teams
- Skill-Based/Knowledge-Based Compensation
- Technology Implementation
- Quality Programs
- Wellness/Fitness Initiatives
70
Basic Elements
- An Evaluation Framework
- A Process Model
- Operating Standards and Philosophy
- Case Applications and Practice
- Implementation
71
Evaluation Framework Level Measurement Focus
1. Reaction & Planned Action — measures participant satisfaction and captures planned actions, if appropriate
2. Learning & Confidence — measures changes in knowledge, skills, and attitudes related to the program
3. Application & Implementation — measures changes in on-the-job behavior or actions
4. Business Impact — measures changes in business impact variables
5. Return on Investment — compares project benefits to the costs
72
Defining the Return on Investment
Benefits/Cost Ratio = Monetary Benefits ÷ Program Costs
ROI (%) = (Net Monetary Benefits ÷ Program Costs) × 100
73
ROI Example
- Costs for project: $80,000
- Benefits from project: $240,000
BCR = $240,000 ÷ $80,000 = 3.0
ROI = ($160,000 ÷ $80,000) × 100 = 200%
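The two formulas can be checked with a few lines of Python; this is a minimal sketch using the example's figures (the function names are illustrative, not part of the ROI Methodology materials):

```python
# Benefit-cost ratio and ROI, as defined on the preceding slide.
def benefit_cost_ratio(benefits, costs):
    """BCR = monetary benefits / program costs."""
    return benefits / costs

def roi_percent(benefits, costs):
    """ROI (%) = net monetary benefits / program costs x 100."""
    return (benefits - costs) / costs * 100

benefits, costs = 240_000, 80_000
print(benefit_cost_ratio(benefits, costs))  # 3.0
print(roi_percent(benefits, costs))         # 200.0
```

Note the difference: the BCR compares total benefits to costs, while ROI compares only the net benefits (benefits minus costs) to costs, expressed as a percentage.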
74
ROI Target Options
- Set the value at the same level as other investments, e.g., 15%
- Set slightly above other investments, e.g., 25%
- Set at break-even: 0%
- Set at client expectations
Private sector organizations usually go with option #2; public sector organizations usually prefer option #3.
75
Characteristics of Evaluation Levels
Along the chain of impact from Satisfaction (through Learning, Application, and Impact) to ROI:
- Value of information: lowest at Satisfaction, highest at ROI
- Customer focus: consumer at Satisfaction, client at ROI
- Frequency of use: frequent at Satisfaction, infrequent at ROI
- Difficulty of assessment: easy at Satisfaction, difficult at ROI
Customers — Consumers: the customers who are actively involved in the process. Client: the customers who fund, support, and approve the project.
76
Evaluation Framework and Key Questions
Level 1: Reaction and Planned Action
- Was the program relevant to participants' jobs and mission?
- Was the program important to participants' job/mission success?
- Did the program provide new information?
- Do participants intend to use what they learned?
- Would participants recommend it to others?
- Is there room for improvement with facilitation, materials, and the learning environment?
77
Evaluation Framework and Key Questions
Level 2: Learning and Confidence
- Do participants know what they are supposed to do with what they learned?
- Do participants know how to apply what they learned?
- Are participants confident to apply what they learned?
- Did participants gain new knowledge, change their attitude, increase awareness?
78
Evaluation Framework and Key Questions
Level 3: Application and Implementation
- How effectively are participants applying what they learned?
- How frequently are they applying what they learned?
- If they are applying what they learned, what is supporting them?
- If they are not applying what they learned, why not?
79
Evaluation Framework and Key Questions
Level 4: Business Impact — so what?
- To what extent does participant application of what they learned improve the measures the program was intended to improve?
- How did the program impact output, quality, cost, time, customer satisfaction, employee satisfaction, work habits?
- What were the consequences of participants' application of knowledge and skills acquired during the program, process, intervention, change?
- How do we know it was the program that improved these measures?
80
Evaluation Framework and Key Questions
Levels of Evaluation Key Questions Answered Level 5: ROI Do the monetary benefits of the improvement in business impact measures outweigh the cost of the program?
81
Chain of Impact
Reaction & Planned Action → Learning & Confidence → Application & Implementation → Impact (with a step to isolate the effects of the program) → ROI
Intangible Benefits
82
Needs Assessment → Program Objectives → Evaluation
5. Potential Payoffs → ROI Objectives → ROI
4. Business Needs → Impact Objectives → Business Impact
3. Job Performance Needs → Application Objectives → Application
2. Skills/Knowledge Needs → Learning Objectives → Learning
1. Preferences → Reaction Objectives → Satisfaction
83
Matching Evaluation Levels with Objectives
Reaction Learning Application Impact Return on Investment
84
The ROI Methodology: Evaluation Planning and Data Collection
- Develop objectives of solution(s)
- Develop evaluation plans and baseline data
- Collect data during solution implementation (Level 1: Reaction and Planned Action; Level 2: Learning and Confidence)
- Collect data after implementation (Level 3: Application and Implementation; Level 4: Business Impact)
85
Data Analysis and Reporting
- Isolate the effects
- Convert data to monetary value
- Tabulate costs of solution
- Calculate the return on investment (Level 5: ROI)
- Identify intangible measures (Intangible Benefits)
- Generate impact study
86
Evaluation Planning Develop/Finalize Objectives
Reaction Learning Application Impact ROI
87
Evaluation Planning Data Collection Plan
Data Collection Plan columns:
- Broad Program Objectives
- Measures
- Data Collection Method/Instruments
- Data Sources
- Timing
- Responsibilities
88
Evaluation Planning ROI Analysis Plan
- Data Items (usually Level 4)
- Methods for Isolating the Effects of the Program/Process
- Methods for Converting Data to Monetary Values
- Cost Categories
- Intangible Benefits
- Communication Targets for Final Report
- Other Influences/Issues During Application
- Comments
89
Evaluation Planning Project Plan
Major Milestones Deliverables Timelines Flow
90
Data Collection During Program
Methods (Levels 1 and 2):
- Surveys
- Questionnaires
- Observation
- Interviews
- Focus Groups
- Tests/Quizzes
- Demonstrations
- Simulations
91
Data Collection Post Program
Methods (Levels 3 and 4):
- Surveys
- Questionnaires
- Observations on the job
- Interviews
- Focus Groups
- Action planning/improvement plans
- Performance contracting
- Performance monitoring
92
Isolating the Effects of the Program
- Use of control groups
- Trend line analysis
- Forecasting methods
- Participant's estimate
- Management's estimate of impact (percent)
- Use of experts/previous studies
- Calculate/estimate the impact of other factors
- Customer input
93
Isolating the Effects of the Program
Method (listed by credibility) — best practice use*:
- Comparison Group Analysis: 35%
- Trend/Forecasting Analysis: 20%
- Expert Estimation: 50%
- Other: 20%
* Percentages exceed 100% (multiple methods in use).
NOTES: Which techniques are appropriate in your organization?
94
Example - Use of Control Groups
Customer Service Compensation
- Six sites chosen for program evaluation
- Each site had a control group and an experimental group, randomly selected
- Experimental group received the new plan; control group did not
- Performance observed for both groups at the same time
95
Use of Trend Line Analysis — Shipment Productivity
[Chart: percent of schedule shipped, by month. Pre-program average: 87.3%; trend projection at measurement: 92.3%; actual average after team implementation: 94.4%]
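The trend-line method can be sketched in Python: fit a least-squares line to the pre-program data, project it into the post-program period, and credit the program only with the difference between actual results and the projected trend. The monthly figures below are illustrative, not the chart's underlying data:

```python
# Trend-line isolation (sketch): the pre-program trend explains part of
# the improvement; only the gap above the projection is attributed to
# the program.
def fit_trend(values):
    """Ordinary least-squares line through (0, v0), (1, v1), ..."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

pre_program = [86.0, 87.0, 88.0, 89.0, 90.0, 91.0]  # illustrative monthly %
slope, intercept = fit_trend(pre_program)

month = 9                              # some months after the program
projected = intercept + slope * month  # where the trend alone would land
actual = 97.5                          # illustrative observed value
program_effect = actual - projected    # improvement attributed to the program
print(program_effect)                  # 2.5
```

The key assumption, as with any trend-line analysis, is that the pre-program trend would have continued unchanged had the program not been implemented.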
96
Example of a Participant’s Estimation
97
Converting Data to Money
- Profit/savings from output (standard value)
- Cost of quality (standard value)
- Employee time as compensation (standard value)
- Historical costs/savings from records
- Expert input
- External studies
- Linking with other measures
- Participant estimation
- Management estimation
- Estimation from staff
98
Converting Data to Money
Techniques, by credibility and resources needed: standard values (high credibility, low resources), then records/reports analysis and databases (moderate), then expert estimation.
99
Example of Converting Data Using External Database
Cost of one turnover* — Middle Manager:
- Annual salary: $70,000
- Cost of turnover: 150% of salary
- Total cost of turnover: $105,000
* External data — value obtained from industry-related study
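A minimal sketch of this conversion in Python, assuming the external study supplies the turnover-cost multiple (the function name is illustrative):

```python
# Converting a turnover statistic to money using an external benchmark:
# cost of one turnover = annual salary x turnover-cost multiple.
def turnover_cost(annual_salary, cost_multiple):
    return annual_salary * cost_multiple

# Figures from the middle-manager example above (150% of salary).
print(turnover_cost(70_000, 1.50))  # 105000.0
```

Using a published, industry-related multiple rather than an internal estimate is what gives this conversion its credibility.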
100
Cost of a Sexual Harassment Complaint
35 complaints annually
- Actual costs from records: legal fees, settlements, losses, material, direct expenses
- Additional estimated costs from staff: EEO/AA staff time, management time
Total: $852,000 annually
Cost per complaint = $852,000 ÷ 35 = $24,343
101
Example of Linkage with Other Measures
A Compelling Place to Work → A Compelling Place to Shop → A Compelling Place to Invest
- Employee measures: attitude about the job, attitude about the company, behavior, employee recommendations
- Customer measures: retention, service helpfulness, impression, merchandise value
- Investor measures: return on assets, operating margin, revenue growth
A 5-unit increase in employee measures drives a 1.3-unit increase in customer measures and a 0.5 increase in revenue growth.
102
Tabulating Program Costs
Direct:
- Program Materials
- Facilitator Costs
- Facilities
- Travel
Indirect:
- Needs Assessment
- Program Development
- Participant Time
- Administrative Overhead
- Evaluation
103
Intangible Benefits
- Stress
- Teamwork
- Complaints
- Customer Service
- Commitment
- Conflicts
- Engagement
- Job Satisfaction
104
ROI Process Flexibility
Look Forward:
- Pre-program ROI forecast
- End-of-program ROI estimation
Examine Accomplishments:
- Application data (Level 3)
- Impact data (Level 4)
105
Do Not Confuse the CFO
- ROI — Return on Investment: not Information, not Intelligence, not Inspiration, not Involvement
- ROE — Return on Equity: not Expectation
- ROA — Return on Assets: not Anticipation
- ROCE — Return on Capital Employed: not Client Expectation
106
Common Target Audiences
Reason for Communication — Primary Target Audience:
- Secure approval for program — Client, top executives
- Gain support for the program — Immediate managers, team leaders
- Build credibility for the training staff — Top executives
- Enhance reinforcement of the program — Immediate managers
- Enhance results of future programs — Participants
- Show complete results of the program — Key client team
- Stimulate interest in HR programs — Top executives
- Demonstrate accountability for client expenditures — All employees
- Market future HR programs — Prospective clients
107
Select Media
- Impact Studies: full report, executive summary, general overview, one-page summary
- Meetings: executive meetings, manager meetings, staff meetings, panel discussions, best practice meetings
- Internal Publications: announcements, bulletins, newsletters, magazines
- Progress Reports: schedules, preliminary results
- Memos
- Case Studies
- Program Brochures
- Scoreboards
- Electronic Media: web sites, video blogs
108
Implementation Issues
- Resources (staffing/budget)
- Leadership (individual, group, cross-functional team)
- Timing (urgency, activities)
- Communication (various audiences)
- Commitment (staff, managers, top executives)
Notes:
109
Key Implementation Actions
- Determine/establish responsibilities
- Develop skills/knowledge with ROI
- Develop transition/implementation plan
- Conduct ROI studies
- Prepare/revise evaluation policy/procedures/guidelines
- Train/brief managers on the ROI process
- Communicate progress/results
110
Retail Merchandise Company
111
Utility Services Company
112
Utility Services Company Business Impact
Measure — (A) Monthly improvement in six months × (B) Percent contribution from team building × (C) Average confidence estimate = Adjusted improvement in six months (A × B × C):
- Productivity: 23% × 57% × 86% = 11.3%
- Quality: 18% × 38% × 74% = 5%
- Efficiencies: 14.5% × 64% × 91% = 8.4%
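The adjustment column can be reproduced with a short Python sketch (names are illustrative): each raw improvement is multiplied by the estimated contribution of the program and by the confidence in that estimate, a deliberately conservative discounting.

```python
# Adjusted improvement = A x B x C: raw improvement x estimated
# contribution of the program x confidence in that estimate.
def adjusted_improvement(improvement, contribution, confidence):
    return improvement * contribution * confidence

measures = {                       # (improvement %, contribution, confidence)
    "Productivity": (23.0, 0.57, 0.86),
    "Quality":      (18.0, 0.38, 0.74),
    "Efficiencies": (14.5, 0.64, 0.91),
}
for name, (a, b, c) in measures.items():
    print(name, round(adjusted_improvement(a, b, c), 1))
```

Each factor can only shrink the claimed improvement, never enlarge it, which is why only the adjusted figures are carried forward into the monetary benefits.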
113
Utility Services Company
Program costs for 18 participants = $54,300
Annualized first-year benefits:
- Productivity: $197,000
- Quality: $121,500
- Efficiency: $90,000
- Total: $408,500
ROI = (($408,500 − $54,300) ÷ $54,300) × 100 = 652%
114
Matching Exercise: The Twelve Guiding Principles of ROI
115
Regional Public Utility
116
Level 3 and 4 Objectives Provide:
- Direction to designers and developers
- Guidance to instructors and facilitators
- Goals for participants
- Satisfaction for program sponsors
- A framework for evaluators
Notes: explain each of the above (developers, facilitators, participants, sponsors, evaluators).
117
Needs Assessment → Program Objectives → Evaluation
5. Potential Payoffs → ROI Objectives → ROI
4. Business Needs → Impact Objectives → Business Impact
3. Job Performance Needs → Application Objectives → Application
2. Skills/Knowledge Needs → Learning Objectives → Learning
1. Preferences → Reaction Objectives → Satisfaction
118
Linking Needs Assessment with Evaluation
Needs → Program Objectives → Evaluation, by level:
Level 4 — Need: An absenteeism problem exists. Objective: Weekly absenteeism rate will reduce. Evaluation: Monitor absenteeism data for six months.
Level 3 — Need: Discussions between team leader/supervisor are not occurring when there is an absence. Objective: Counseling discussions conducted in 95% of situations when an unexpected absence occurs. Evaluation: Follow-up questionnaire to participants to check frequency of discussions at three months.
Level 2 — Need: Deficiency in counseling/discussion skills. Objective: Counseling discussion skills will be acquired/enhanced. Evaluation: Skill practice sessions during program.
Level 1 — Need: Supervisor prefers to attend training one day per week. Objective: Program receives favorable rating of 4 out of 5 on the structure of the program. Evaluation: Reaction questionnaire at the end of the program.
119
Application Objectives
Project Business Alignment and Forecasting — The ROI Process Model (V Model)
Start with the initial analysis (needs), set objectives, and end with measurement and evaluation:
5. Payoff Needs → ROI Objectives → ROI
4. Business Needs → Impact Objectives → Impact
3. Performance Needs → Application Objectives → Application
2. Learning Needs → Learning Objectives → Learning
1. Preference Needs → Reaction Objectives → Reaction
120
Program Alignment V-Model
Needs → Objectives → Evaluations (start with the initial analysis; end with measurement and evaluation):
5. Payoff Needs: Absenteeism is costing $10,000 monthly → ROI Objectives → ROI
4. Business Needs: Unexpected absenteeism is 9% and increasing; greater than benchmarking of 5% → Impact Objectives → Impact
3. Job Performance Needs: Discussions between team member and supervisor are not occurring when there is an unplanned absence → Application Objectives → Application
2. Learning Needs: Deficiency in counseling/discussion skills → Learning Objectives → Learning
1. Preference Needs: One-day counseling skills workshop must provide usable, necessary, and relevant skills; facilitator-led; participants are supervisors → Reaction Objectives → Reaction
(Project Business Alignment and Forecasting — The ROI Process Model)
121
Nissan Motor Manufacturing Company
126
Wachovia Bank
127
Metro Hospital
128
Regional Health Center
129
Department of Internal Affairs
130
International Car Rental
131
Results-Based Approach
Performance Assessment and Analysis Process:
1. Problem/opportunity present or anticipated.
2. Identify business needs, gaps, and stakeholders (Level 4).
3. Identify job performance needs, gaps, and why (Level 3).
4. Specify skill/knowledge deficiencies of the affected population (Level 2); determine whether training is required.
5. Identify preferences (Level 1).
6. Identify solutions; identify transfer strategy options and Level 2 and Level 3 support for all stakeholders.
7. Develop objectives and evaluation strategy.
8. Design the solution and stakeholder components; consider resources, logistics, and delivery; develop content/materials.
9. Implement pre-activity; conduct/implement the solution; implement the transfer strategy.
Each level includes: data sources, data collection, key questions, key issues.
132
Phillips ROI Methodology™
Collect Data After Solution Implementation (Levels 3 and 4) → Isolate the Effects of the Solution → Convert Data to Monetary Value → Tabulate Costs of the Solution → Calculate the Return on Investment (Level 5) → Identify Intangibles (Intangible Benefits) → Develop Report and Communicate Results.
Significant influences: policy statement, procedures and guidelines, staff skills, management support, technical support, organizational culture.
133
Key Alignment Questions
Needs assessment (Potential Payoffs, Level 5): Is this a problem worth solving? Is there a potential payoff?
Program objectives: ROI objectives.
Evaluation (ROI, Level 5): What is the actual ROI? What is the BCR?
134
Key Alignment Questions
Needs assessment (Business Needs, Level 4): What is the specific measure? What happens if we do nothing?
Program objectives: Impact objectives.
Evaluation (Business Impact, Level 4): Which business measure improved? How much is related to the program?
135
Key Alignment Questions
Needs assessment (Job Performance Needs, Level 3): What is occurring or not occurring on the job that influences the business measure?
Program objectives: Application objectives.
Evaluation (Application, Level 3): What has changed? Which skills/knowledge have been applied?
136
Key Alignment Questions
Needs assessment (Skills/Knowledge Needs, Level 2): What skills or knowledge are needed to support the job performance need?
Program objectives: Learning objectives.
Evaluation (Learning, Level 2): What did they learn? Who did they meet?
137
Key Alignment Questions
Needs assessment (Preferences, Level 1): How should the solution be structured?
Program objectives: Reaction objectives.
Evaluation (Reaction, Level 1): What was the reaction to the program? Do we intend to implement the program?
138
Developing Reaction Objectives
Objective: At the end of the course, participants will perceive program content as relevant to their jobs.
Measure: 80% of participants rate program relevance 4.5 out of 5 on a Likert scale.
139
Developing Learning Objectives
Objective: At the end of the course, participants will be able to use Microsoft Word.
Measure: Within a 10-minute period, each participant will demonstrate the following Microsoft Word operations to the facilitator with zero errors:
- File, Save as, Save as Web Page
- Format, including font, paragraph, background, and themes
- Insert tables, add columns and rows, and delete columns and rows
140
Developing Application Objectives
Objective: Participants will use effective meeting behaviors.
Measures:
- Participants will develop a detailed agenda outlining the specific topics to be covered for 100% of meetings.
- Participants will establish meeting ground rules at the beginning of 100% of meetings.
- Participants will follow up on meeting action items within three days following 100% of meetings.
141
Developing Impact Objectives
Objective: Increase market share.
Measure: Increase market share of young professionals by 10% within nine months of the new ad launch.
Objective: Improve the quality of the X-1350.
Measures:
- Reduce the number of warranty claims on the X-1350 by 10% within six months after the program.
- Improve overall customer satisfaction with quality of the X by 10%, as indicated by a customer satisfaction survey taken six months after the program.
- Achieve top scores on product quality measures included in the industry quality survey.
142
Developing Level 3 and 4 Objectives
143
Evaluation Targets (Large Telecommunications Company)
144
Criteria For Selecting Programs For Level 3 Evaluation
Significant gaps in performance suspected Safety and health of employees at risk Learning transfer significantly important to customer service / satisfaction goals Learning transfer significantly important to success of company strategic initiatives Pilot program delivered
145
Criteria for Selecting Programs for Level 4 & 5 Evaluation
The life cycle of the program The linkage of the program to operational goals and issues The importance of the program to strategic objectives The cost of the program Visibility of the program The size of the target audience The investment of time A comprehensive needs assessment is conducted Top executives are interested in the evaluation
146
Evaluation Planning Meeting Who Should Be Involved?
Program owner Program designer Program analyst Program facilitator Business unit partner Subject matter expert Typical participant
147
Evaluation Planning Meeting Factors For Success
Credible sources Access to data Complete coverage Move quickly Consider outputs to be drafts Sponsor sign-off
148
Evaluation Planning Meeting Agenda
Explain purpose Finalize/adjust objectives Complete data collection plan Complete ROI analysis plan step by step Compile ROI project plan step by step
149
Global Financial Services, Inc.
150
Data Analysis and Results
Results at Level 1 Rating of 4.23 out of 5 achieved on the relevance of ACT! for specific job applications. 92.5% of participants indicated an intention to use ACT! within two weeks of the workshop.
151
Data Analysis and Results
Results at Level 2 83% of the participants scored 75 or better on the ACT! use test. Participants successfully demonstrated an average of 4.2 out of 5 key features of ACT! which are . . .
152
Key Features of ACT!
- Enter a new contact
- Create a mail-merge document
- Create a query
- Send an e-mail
- Create a call report
153
Data Analysis and Results
Results at Level 3
- Participants indicated that within 10 days, 92% of new customer prospects are entered into the system.
- Participants report an increase in the number of planned follow-up contacts with customers.
- An unscheduled audit of daily use resulted in a score of 76 out of a possible 100.
154
Results at Level 4

Impact Measure | Average Monthly Change | Contribution of ACT! | Annual Value
Customer Complaints | 24.2 | 43% | $575,660
Customer Response | 18 minutes per customer | 72% | N/A
Sales to Existing Customers | $321,000 | 14% | $539,280
Customer Satisfaction | 26% | 49% | N/A
Total annual value: $1,114,940
155
Project Costs
Development Costs: $10,500
Materials/Software: $18,850
Equipment: $6,000
Instructor (including expenses): $7,200
Facilities/Food/Refreshments (at $58): $3,480
Participants' Time, Lost Opportunity (at $385): $22,330
Coordination/Evaluation: $15,600
Total: $83,960
156
ROI Calculation $1,114,940 - $83,960 X 100 = 1228% $ 83,960 ROI (%) =
157
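The ROI calculation above can be checked with a few lines of Python, using the benefit and cost figures reported for the ACT! study:

```python
# Benefit-cost ratio and ROI using the figures reported for the ACT! study:
# total annual benefits of $1,114,940 against fully loaded costs of $83,960.
benefits = 1_114_940
costs = 83_960

bcr = benefits / costs                   # dollars returned per dollar spent
roi = (benefits - costs) / costs * 100   # net benefits as a percent of costs
print(f"BCR = {bcr:.2f}, ROI = {roi:.0f}%")
```

Note the distinction the methodology emphasizes: the BCR divides total benefits by costs, while ROI divides net benefits (benefits minus costs) by costs.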
Program Profile Title: Interactive Selling Skills
Target Group: Sales Associates in Electronics Vendor Produced and Delivered 3 Days - (2 Days Plus 1 Day) Significant Use of Skill Practices 3 Groups Trained (48 Participants from 3 Stores)
158
ROI Analysis Profile
Post-program data collection: performance monitoring (4 months); questionnaire (3 months); program follow-up session (3 weeks, last session)
Isolating the effects of training: control group arrangement; participant's estimate (for backup)
Converting data to monetary values: profit contribution of increased output
159
Level 1 - Selected Data
Success with Objectives: 4.3
Relevance of Material: 4.4
Usefulness of Program: 4.5
Exercises/Skill Practices: 3.9
Overall Instructor Rating: 4.1
160
Level 2 - Selected Data
All participants demonstrated that they could use the skills successfully.
161
Level 3 - Selected Data (2 Questions out of 20)
- I utilize the skills taught in the program.
- Frequency of use of skills.
162
Level 4 Data: Average Weekly Sales
Post-Training Data

Weeks After Training | Trained Groups | Control Groups
1 | $9,… | $9,698
2 | … | $9,720
3 | … | $9,812
13 | $13,… | $11,572
14 | … | $10,683
15 | … | $10,092
Average for Weeks 13, 14, 15 | $12,075 | $10,449
163
Annualized Program Benefits
46 participants were still in the job after 3 months.
Average Weekly Sales per Employee:
Trained Groups: $12,075
Untrained Groups: $10,449
Increase: $1,626
Profit Contribution (2% of store sales): $32.50
Total Weekly Improvement (x 46): $1,495
Total Annual Benefits (x 48 weeks): $71,760
164
Cost Summary 48 participants in 3 courses
Facilitation Fees (3 courses at $3,750): $11,250
Program Materials (48 at $35/participant): $1,680
Meals/Refreshments (48 x 3 days at $28/participant): $4,032
Facilities (9 days at $120): $1,080
Participant Salaries Plus Benefits (35% factor): $12,442
Coordination/Evaluation: $2,500
Total Costs: $32,984
165
Level 5 Data
BCR = $71,760 / $32,984 = 2.18
ROI (%) = ($71,760 - $32,984) / $32,984 x 100 = 118%
166
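The retail case's benefit chain and Level 5 figures can be reproduced in a short Python sketch, using the rounded values reported on the preceding slides:

```python
# Reproduces the retail case's Level 5 numbers: weekly sales lift per
# employee -> annual benefits -> BCR and ROI. All inputs are the rounded
# figures reported on the slides.
trained_avg = 12_075        # average weekly sales per employee, trained groups ($)
untrained_avg = 10_449      # average weekly sales per employee, untrained groups ($)
participants = 46           # participants still in the job after 3 months
weeks_per_year = 48         # benefits annualized over 48 weeks

increase = trained_avg - untrained_avg        # $1,626 per employee per week
profit_per_week = 32.50                       # 2% of store sales, as rounded on the slide
weekly_improvement = profit_per_week * participants    # $1,495
annual_benefits = weekly_improvement * weeks_per_year  # $71,760

costs = 32_984              # fully loaded program costs ($)
bcr = annual_benefits / costs
roi = (annual_benefits - costs) / costs * 100
print(f"BCR = {bcr:.2f}, ROI = {roi:.0f}%")
```

The conservative conventions are visible here: only participants still in the job count, and benefits are annualized over 48 weeks rather than 52.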
ROI Example: Retail Merchandise Company
Collecting Post-Program Data (follow-up session, questionnaire, performance monitoring) → Isolating the Effects of the Program (control groups, participants' estimates) → Converting Data to Monetary Value (standard values): $71,760 → Tabulating Program Costs: $32,984 → Calculating the Return on Investment: 118% → Identifying Intangible Benefits
167
The ROI Process Takes A Balanced View by Measuring And Reporting:
Reaction to program Learning and attitudes Application on the job Impact in work unit Impact on the customer The financial results Intangible benefits Nature and source of problems and opportunities
168
The Business Case for EI
169
Self Test: How Results-Based Are Your Human Resources Programs?
170
Data Collection Issues
Objectives Type of data Instruments Methods Sources of data Timing of collection Responsibilities
171
Collecting Program Data
Surveys, Questionnaires, Observation, Interviews with Participants, Focus Groups, Tests, Action Planning, Performance Contracting, Performance Monitoring
172
Classic Evaluation Instruments
Questionnaires Surveys Tests Interviews Focus Groups Observation Performance Records
173
Applications of Data Collection Instruments
Matching Exercise Focus groups Observation Performance Records Survey Test Questionnaire Interview
174
Data Collection Exercise Part 1
Program: 3-day Leadership Workshop
Audience: 50 Middle-Level Managers
Level 3 Objectives:
- Apply the 11-step goal-setting process with each employee three months after the workshop
- Apply techniques that influence the motivational climate within three months
- Apply techniques that inspire teamwork
- Apply coaching techniques to enhance employee engagement
Level 4 Objective: Improve business measures important to your work unit
175
Survey/Questionnaire Design
Determine the specific information needed Review information with stakeholders Select the type(s) of questions Keep questions and statements simple Develop the questions Design for easy tabulation and analysis and . . .
176
Survey/Questionnaire Design
Check the reading level Address the anonymity issue Test the questions Review results of the field test Develop the completed questionnaire Develop administrative procedures
177
Common Mistakes in Survey/Questionnaire Design
Vague statements/questions Too many questions Reading level too high Improperly worded questions Confusing instructions Too difficult to analyze
178
Questionnaire Design Checklist
Is the overall length appropriate? Is it a valid instrument? Is it a reliable instrument? Do the questions flow properly? Are the types of questions appropriate for the information desired?
179
Questionnaire Design Checklist
Are the questions designed to take advantage of data comparisons? Is it designed to minimize distortion? Are the questions designed to ease data tabulation and analysis? Have administrative issues been addressed? Is it easy to read?
180
Questionnaire Design Checklist
Are the instructions clear? Have steps been taken to ensure confidentiality? Have provisions been made for demographic data? Is the appearance of the questionnaire adequate? Is a pre-test scheduled?
181
Selecting Survey Scales
Variance – are there enough choices? Discrimination – can you tell the difference between choices? Accuracy – do the scale labels accurately describe the choices? Symmetry – is the scale balanced appropriately?
182
What Makes an Effective Survey Question?
Focus – every question should focus on a single issue or specific topic Brevity – short questions present less opportunity for measurement error Clarity – clear questions are understandable to all respondents
183
Types of Tests
- Objective tests
- Criterion-referenced tests
- Norm-referenced tests
- Performance tests
184
Types of Objective Tests
True/false Matching items Multiple choice items Fill in the blank items Short answer items Essay items
185
Steps to Developing Objective Tests
Focus on one set of related course objectives at a time Determine behavioral evidence of capability related to these objectives Select a format and an item type that fits the objectives Develop 3 to 5 items for each objective Sequence items in a logical order Prepare test instructions that are simple and easy to understand Pilot test
186
Interview Design: Structured and Unstructured
List basic questions to be asked. Follow the same principles as survey/questionnaire design. Allow for probing. Try out the interview. Prepare the interviewers. Provide instructions to the individual being interviewed. Administer the interviews consistently.
187
Focus Group Guidelines
Select topics, questions, and strategy carefully. Keep the group size small. Ensure that there is a representative sample of the target population. Insist on facilitators having appropriate expertise. Stay on track and on time. Allow equal time for all participants. Control over-talking and under-talking.
188
Observation Guidelines
Observations should be systematic Observers should know how to interpret and record what they see Observer’s influence should be minimized Observers must be carefully selected Observers must be prepared
189
Observation Methods Behavior Checklist Coded Behavior Record
Delayed Report Method Video Recording Audio Monitoring Computer Monitoring (software)
190
Typical Sources of Performance Data
Operating reports Departmental reports Work unit audits Key performance indicators Six Sigma reports Scorecards Dashboards
191
Monitoring Performance Data
Identify appropriate data sources. Collect data related to objectives only. Develop new data as needed. Convert current data to usable items. Develop a collection plan to include Who, What, Where, and When.
192
Characteristics of Effective Instruments
Valid Reliable Simple Economical Easy to administer Easy to analyze data
193
Factors to Consider When Selecting Data Collection Methods
Time required for participants Time required for participant’s supervisor Costs of method Amount of disruption of normal activities Accuracy Utility Culture / Philosophy
194
Sources of Information for Program Evaluation
Participants Supervisors of participants Subordinates of participants Peer group Internal staff External Group Organizational performance records
195
Factors to Consider When Determining Timing of Follow-Up
Availability of data Ideal time for behavior change (level 3) Ideal time for business impact (level 4) Convenience of collection Constraints on collection
196
Data Collection Exercise
Program: 3-Day Leadership Workshop Audience: 50 Middle Level Managers (2 Groups) Follow Up: Anonymous questionnaire in 3 months to collect application and impact data Assignment: 1. What topics should be included in the questionnaire?
197
Cyber International
198
Sales Culture at Progress Bank
199
Developing ROI with Action Planning
Communicate the action plan requirement early. Describe the action planning process at the beginning of the intervention. Teach the action planning process. Allow time to develop the plan. Have the facilitator approve the action plan. Require participants to assign a monetary value for each improvement. and . . .
200
Developing ROI with Action Planning
Ask participants to isolate the effects of the program. Ask participants to provide a level of confidence for estimates. If possible, require action plans to be presented to the group. Explain the follow-up mechanism. Collect action plans at the pre-determined follow-up time. Summarize the data and calculate the ROI.
201
Performance Contract Process Steps
The participant and supervisor mutually agree on a subject for improvement. A specific, measurable goal(s) is set. The learner participates in the program. The contract is discussed, and plans are developed to accomplish the goals. Notes: What key points should be included in a performance contract for a sales manager to attend a 3-day sales management workshop? ____________________________________ ____________________________________ and . . .
202
Performance Contract Process Steps
After the program, the participant works on the contract against a specific deadline. The participant reports the results of the effort to his supervisor. The supervisor and participant document the results for the staff. Notes: What key points should be included in a performance contract for a sales manager to attend a 3-day sales management workshop? ____________________________________ ____________________________________
203
The Performance Contract Should Be:
Written Understandable (by all involved) Challenging (requiring a concentrated effort to achieve) Achievable Largely under the control of the participant Measurable and dated Notes: Can you add other items to the list? ____________________________________ ____________________________________
204
Response Rate Exercise
Program: 3-Day Leadership Workshop
Audience: 50 Middle-Level Managers (2 Groups)
Follow-Up: Anonymous 5-page questionnaire in 3 months to collect application and impact data
Assignment:
1. How many responses do you need?
2. How are you going to ensure you receive the appropriate number of responses?
205
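One way to answer the first assignment question is a standard finite-population sample-size calculation. This is an illustrative sketch, not a method prescribed by the slides; the 95% confidence level and 10% margin of error are assumed values.

```python
import math

def required_responses(population, z=1.96, p=0.5, margin=0.10):
    """Cochran's sample-size formula with finite-population correction.
    z: z-score for the desired confidence level (1.96 ~ 95%);
    p: expected proportion (0.5 is the most conservative choice);
    margin: acceptable margin of error. Defaults are illustrative."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2
    return math.ceil(n0 / (1 + (n0 - 1) / population))

# For the 50-manager audience in the exercise:
print(required_responses(50))
```

With only 50 managers, roughly two-thirds would need to respond under these assumptions, which is why the response-rate techniques on the following slides matter.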
Follow-Up Questionnaire Checklist
Progress with objectives Action plan implementation Relevance of program Perceived value Use of materials Knowledge/skill enhancement Skills used Changes with work Linkage with output measures Notes: ____________________________________
206
Follow-Up Questionnaire Checklist
Other Benefits Barriers Enablers Management support Other solutions Recommendations for target audience Suggestions for improvement Other comments Notes: ____________________________________
207
Follow-Up Questionnaire Checklist
(OPTIONAL) Improvements/accomplishments Improvement linked with program Monetary impact Confidence level Notes: ____________________________________
208
Impact Questions for Follow-Up Evaluation
How did you use the material from this program? What influence did it have on your work? On your team? What is the specific measure influenced? Define it. What is the unit value of the measure? (Profit or cost) What is the basis of this value? How much did the measure change since the program was conducted? and . . .
209
Impact Questions for Follow-Up Evaluation
What is the frequency of the measure (daily, weekly, monthly, etc.)? What is the total annual value of improvement? What are the other factors that could have caused this total improvement? What percent of the total improvement can be attributed to this program? What is your confidence estimate of the above data? (0% = no confidence; 100% = certainty)
210
Performance Contract Sample
211
Option 1: When You Don't Have a Clue
Option 2: When the Measure Is in a Defined Set
Option 3: When the Measure Is Known
212
Increasing Response Rates
Provide advance communication Clearly communicate the reason for the questionnaire Indicate who will see the results Show how the data will be integrated Keep the questionnaire simple and brief Make it easy to respond Use the local manager to help distribute the questionnaires and show support Let the target audience know that they are part of a carefully selected sample Notes: Can you add to this list? ____________________________________ and . . .
213
Increasing Response Rates
Use one or two follow-up reminders Have the introduction letter signed by a top executive Enclose a giveaway item with the questionnaire Provide an incentive for quick response Send a summary of results to target audience Distribute questionnaire to a captive audience Consider an alternative distribution channel Have a third party gather and analyze data. Notes: Can you add to this list? ____________________________________ and . . .
214
Increasing Response Rates
Communicate the time limit Consider paying for the time it takes to complete the questionnaire Review the questionnaire at the end of the formal session Carefully select the survey sample Allow completion of the survey during work hours Add emotional appeal and . . .
215
Increasing Response Rates
Design questionnaire to attract attention, with a professional format Let participants know what actions will be taken with the data Provide options to respond Use a local coordinator to help distribute and collect questionnaires Frame questions so participants can respond appropriately and make the questions relevant
216
First Bank Is this situation unusual? Please explain.
Should the CEO drop the issue? What are some approaches to resolve this dilemma? What would you do?
217
Isolating the Effects of a Program
Matching Exercise Control group Trend line analysis Forecasting Participant’s estimate Use of customer input Expert estimates
218
Several Factors Contribute to an Improvement After a Program Is Conducted
Total improvement after a program reflects several influences: external factors, management attention, incentives, systems/procedures changes, and HR programs. The effect of HR is only one portion of that total improvement.
219
Techniques to Isolate the Effects of Programs
Use of a control group arrangement Trend line analysis of performance data Use of forecasting methods of performance data Participant’s estimate of impact (percent) Supervisor’s estimate of impact (percent) Management’s estimate of impact (percent) Use of experts/previous studies Calculating/estimating the impact of other factors Use of customer input
220
Financial Services What are the major problems with the implementation of a control group arrangement illustrated in this case? How can these problems be tackled on a practical basis? Will the same strategy of using control groups work at your organization? Explain.
221
Use of Control Groups Customer service training
Six sites chosen for program evaluation Each site had a control group and an experimental group randomly selected Experimental group received training, control group did not Collected customer service data for both groups at the same time
222
Control Group Design
Control Group: M1 → M2 (no program)
Experimental Group: M1 → Program → M2
223
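A minimal sketch of how the pre/post control-group design isolates a program effect; the measurement scores below are hypothetical:

```python
# Pre/post control-group design from the slide, with hypothetical scores.
# The program effect is the experimental group's change minus the control
# group's change (a difference-in-differences).
control = {"M1": 82.0, "M2": 84.0}        # measured twice, no program
experimental = {"M1": 81.0, "M2": 91.0}   # program delivered between M1 and M2

control_change = control["M2"] - control["M1"]                 # other influences only
experimental_change = experimental["M2"] - experimental["M1"]  # program + other influences
program_effect = experimental_change - control_change
print(program_effect)
```

Subtracting the control group's change removes the influences that affected both groups, which is exactly why group selection and contamination (next slides) matter so much.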
Post-Test Only, Control Group Design
Control Group: → Measurement
Experimental Group: Program → Measurement
224
Ideal Experiment Design
Group A: M1 → Program → M2
Group B: M1 → M2
Group C: Program → M3
225
Control Group Problems
It is inappropriate in many settings Selection of groups Contamination of control group Duration / timing Influences are inconsistent Too research-based for some organizations
226
Micro Electronics, Inc. - Reject Rate (CPI Program)
Pre-program average: 1.85%
Projected average, using pre-program data as a base: 1.45%
Post-program six-month average: 0.7%
(Monthly reject-rate chart, January through the following January; program conducted mid-year.)
227
Questions for Discussion
Approximately what improvement in reject rate has resulted from the program? How reliable is this process? When can this process be used?
228
Healthcare, Inc. - Formal Internal Complaints of Sexual Harassment
(Chart: monthly complaints over two years showing the pre-program average, a projected value, and the post-program average, with the sexual harassment prevention program conducted mid-series.)
229
Use of Trend Line Analysis - Shipment Productivity (Percent of Schedule Shipped)
Pre-program average: 87.3%
Projected average from the trend: 92.3%
Actual average after the Team Training Program: 94.4%
230
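The trend-line technique can be sketched as a simple least-squares projection: fit a line to pre-program months, extend it into the post-program period, and credit the program with whatever the actual average exceeds the projection by. The monthly percentages below are hypothetical; the slide reports only the averages.

```python
# Trend-line isolation, as in the shipment-productivity example.
pre = [85.0, 85.8, 86.5, 87.4, 88.0, 88.8]   # percent of schedule shipped, months 1-6 (hypothetical)

n = len(pre)
xs = list(range(n))
x_mean = sum(xs) / n
y_mean = sum(pre) / n
slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, pre))
         / sum((x - x_mean) ** 2 for x in xs))
intercept = y_mean - slope * x_mean

projected = [intercept + slope * x for x in range(n, 2 * n)]  # months 7-12
projected_avg = sum(projected) / len(projected)
actual_post_avg = 94.4          # actual post-program average reported on the slide
program_effect = actual_post_avg - projected_avg
print(f"projected {projected_avg:.1f}%, program effect {program_effect:.1f} points")
```

The comparison is against the projected average, not the pre-program average, so improvement that was already underway is not credited to the program.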
Conditions for Trend Line Analysis Use
Pre-program data available Data items are stable Pre-program influences expected to continue No new influences enter the post-program period except for program
231
Woody’s What is the impact of the sales training program on sales?
Is this process feasible in your organization? Explain.
232
(Chart: monthly sales around the point where the program was conducted; sales rise from $1,100 to $1,340 to $1,500. The impact of the training program is estimated at $160 and the impact of advertising at $240, based on a fitted trend line, Y = …)
233
National Bank
234
Monthly increase: 175 new accounts

Contributing Factors | Average Impact on Results | Average Confidence Level
Sales Training Program | 32% | 83%
Incentive Systems | 41% | 87%
Goal Setting/Management Emphasis | 14% | 62%
Marketing | 11% | 75%
Other | 2% | 91%
235
Questions for Discussion
What is the number of new credit card accounts per month that can be attributed to the sales training program? Is this a realistic process to estimate the impact of the program on the increased sales? How could this process be improved?
236
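One way to approach the first discussion question: discount the monthly increase by the program's estimated contribution and by the average confidence in that estimate, using the figures from the slide above.

```python
# National Bank case: adjust the monthly increase in new credit card accounts
# by the sales training program's estimated contribution (32%) and the
# average confidence in that estimate (83%), both taken from the slide.
monthly_increase = 175      # new accounts per month
contribution = 0.32         # share attributed to the sales training program
confidence = 0.83           # average confidence in that estimate

attributed = monthly_increase * contribution * confidence
print(f"{attributed:.1f} new accounts per month attributed to training")
```

The confidence adjustment deliberately understates the result, which is the conservative stance the methodology takes with estimates.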
Using Estimates to Isolate the Effects of a Program
Describe the task and the process. Explain why the information was needed and how it will be used. Ask participants to identify any other factors that may have contributed to the increase. Have participants discuss the linkage between each factor and the specific output measure. and . . .
237
Using Estimates to Isolate the Effects of a Program
Provide participants with any additional information needed Obtain the actual estimate of the contribution of each factor. The total must be 100%. Obtain the confidence level from each employee for the estimate for each factor (100%=certainty; 0%=no confidence). The values are averaged for each factor.
238
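The adjustment procedure above can be sketched as follows; the participant inputs are hypothetical:

```python
# Sketch of the estimate-adjustment procedure: each participant reports an
# improvement, allocates a share of it to the program (shares across all
# factors must total 100%), and rates confidence in the estimate. The
# program's share is discounted by that confidence, then summed.
participants = [
    # (monthly improvement $, share attributed to program, confidence)
    (10_000, 0.40, 0.80),
    (8_000, 0.50, 0.70),
    (12_000, 0.30, 0.90),
]

adjusted = [amount * share * conf for amount, share, conf in participants]
program_improvement = sum(adjusted)
print(program_improvement)
```

Requiring the factor shares to total 100% keeps participants from attributing the full improvement to every factor at once.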
The Power of Estimates Research Comparison with other methods
Handling objections Management reactions Participant reactions
239
Key Issues with Estimates
Use as a last resort Use most credible source for data Collect data in an unbiased way Adjust for error Report it carefully
240
Credibility of Data Which of these items have the most credibility? Rank them. Why are these items credible or not credible? List all the factors that influence the credibility of data. Why are we uncomfortable using estimates in our programs?
241
Credibility of Outcome Data is Influenced by the:
Reputation of the source of the data Reputation of the source of the study Motives of the researchers Personal bias of audience Methodology of the study Assumptions made in the analysis Realism of the outcome data Type of data Scope of analysis
242
Other Isolation Methods
Supervisors Managers Experts Previous studies Customers
243
Use of Participants’ & Managers’ Estimate of Training’s Impact
Factor | Participants | Managers
ISDN knowledge, skills, or experience graduates had before they attended the training | …% | 14%
ISDN knowledge, skills, or experience graduates gained from the training | …% | 36%
ISDN knowledge, skills, or experience graduates acquired on their own after the training | …% | 12%
ISDN reference material or job aids unrelated to the training (e.g., bulletins, methods and procedure documentation) | …% | 9%
Coaching or feedback from peers | …% | 18%
Coaching or feedback from graduates' managers | …% | 5%
Observation of others | …% | 6%
244
National Computer Company (A)
245
Questions for Discussion
Is this an appropriate opportunity for using a control group? Explain. What factors should be considered when selecting the groups? What other options should be explored? When should the attempt to use control groups be abandoned?
246
National Computer Company (B)
247
Program Implementation
(Chart: voluntary turnover rate, 36% to 42%, by month, January through October, with the program implementation point marked.)
248
Questions for Discussion
Can a trend line analysis be used? What conditions must be met for this approach to be used? How credible is this approach?
249
National Computer Company (C)
250
(Chart: voluntary turnover rate, 30% to 38%, plotted against the unemployment rate, starting at 4%, with the fitted line Y = 50 – 3(X).)
251
Questions for Discussion
How can this data be used to isolate the effects of the HR program? How much of a reduction in voluntary turnover is attributed to the increase in the unemployment rate? What cautions and concerns should be considered?
252
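The fitted line from the chart can be used directly to estimate how much of a turnover reduction the unemployment rate explains, which bears on the second discussion question:

```python
# The fitted relationship from the chart: voluntary turnover (%) as a
# function of the unemployment rate (%), Y = 50 - 3(X).
def turnover(unemployment_rate):
    return 50 - 3 * unemployment_rate

# A 1-point rise in unemployment predicts a 3-point drop in turnover, so that
# portion of any turnover reduction is not attributable to the HR program.
explained_drop = turnover(4) - turnover(5)
print(explained_drop)
```

Whatever turnover reduction remains after subtracting the unemployment-driven portion is the most that could be claimed for the program, before any other factors are considered.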
National Computer Company (D)
253
Contributing Factors | Impact on Results | Average Confidence Level
HR Program | 30% | 80%
Unemployment Rate | 50% | 100%
Management Emphasis | 5% | 70%
Competition | 15% | 90%
254
Questions for Discussion
Who should provide the input on this isolation estimate? How should the data be collected? What makes this process credible? What makes this process not so credible?
255
Wisdom of Crowds In this case, the average estimate is near perfect
Estimates are used everywhere Set up your own experiment Estimates should be adjusted Estimates are okay – defend them; don’t prefer them
256
Multi National, Inc. (A) Critique the way in which the data was analyzed to develop the final value. What would you have done differently? Do you think that program benefits should be communicated without the cost of the program? Explain. What cautions or concerns should be addressed when communicating impressive results from training programs?
257
Multi National, Inc. (B) What is the ROI of this program?
How does this value compare with the one previously reported? Which value would you use? Is there a way to integrate the two studies? How do you assess the credibility of this process?
258
Examples of Hard Data Output Costs Time Quality
259
Characteristics of Hard Data
Objectively based Easy to measure and quantify Relatively easy to assign monetary values Common measures of organizational performance Very credible with management
260
Examples of Soft Data Work Habits Initiative/ Innovation
Customer Service Employee Development/ Advancement Work Climate/ Satisfaction
261
Characteristics of Soft Data
Subjectively based in many cases Difficult to measure and quantify directly Difficult to assign monetary values Less credible as a performance measure Usually behaviorally oriented
262
Converting Data to Money
Matching Exercise Profit/savings from output Cost of quality Employee time as compensation Historical cost/savings from records Expert input External database Linking with other measures End user/performer estimation Management estimation Estimation from HR staff
263
Five Steps to Convert a Measure to Money
1. Define the unit of improvement.
2. Determine the value of each unit (V).
3. Calculate the change in unit performance (Δ).
4. Determine the annual change in performance (ΔP).
5. Calculate the improvement value (V x ΔP).
Notes: ____________________________________
264
Example: Spend about 4 minutes with your team to calculate the annual monetary value of the improvement in grievances.
Step 1: One grievance
Step 2: V = $6,500
Step 3: ΔP = reduction of 7 grievances per month due to the program
Step 4: Annual ΔP = ______
Step 5: Annual ΔP x V = ______
265
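One possible answer to the exercise, applying the five steps and assuming the 7-per-month reduction holds for a full year:

```python
# The five conversion steps applied to the grievance example, assuming the
# 7-per-month reduction holds for all 12 months.
unit = "one grievance"                 # Step 1: unit of improvement
unit_value = 6_500                     # Step 2: value of each unit, V ($)
monthly_change = 7                     # Step 3: reduction in grievances per month
annual_change = monthly_change * 12    # Step 4: annual performance change
improvement_value = annual_change * unit_value   # Step 5: V x annual change
print(f"Annual value: ${improvement_value:,}")
```

If the reduction is not expected to persist all year, the annual change should be prorated; only first-year benefits are typically counted.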
Converting Data Converting output to contribution – standard value
Converting the cost of quality – standard value Converting employee’s time – standard value Using historical costs Using internal and external experts Using data from external databases Using participants’ estimates Linking with other measures Using supervisors’ and managers’ estimates Using staff estimates
266
Data Conversion Issues
Use the most credible sources If two credible sources are available, use the most conservative option Adjust for the time value of money Know when to stop this process
267
Standard Values are Everywhere
Finance and Accounting Production Operations Engineering IT Marketing and Customer Service Procurement Research and Development HR
268
Examples of Techniques to Convert Data to Monetary Value
Standard Values (Output to Contribution, Cost of Quality, Employee Time):
Sales - profit margin
Donations - overhead margin
Unproductive man hours - hourly wage
Repackaging - standard value based on time savings (hourly wage)
OSHA fines - fines associated with the incident
Unit per person per hour - profit of one additional product produced per person per hour at the same cost
269
Examples of Techniques to Convert Data to Monetary Value
Historical Costs:
Sexual harassment grievances - litigation costs
Food spoilage - cost to replenish food inventory
Turnover of marine engineers - average replacement costs plus separation costs
Internal / External Experts:
Electric utility rate - internal economist
Life - internal risk manager
External Databases:
Turnover of mid-level managers - ERIC
Turnover of restaurant wait staff - Google
270
Examples of Techniques to Convert Data to Monetary Value
Link with Other Measures:
Employee satisfaction - linked to customer satisfaction, linked to profit
Customer complaints regarding baggage mishandling - percent of complaints linked to percent who will not repurchase a seat on the airline, linked to lost revenue
Estimations (participants, supervisors/managers, staff):
Unexpected absence - supervisor estimate (basis provided) x confidence adjustment
Unwanted network intrusions - participant estimate (basis provided) x confidence adjustment
271
Cost of a Sexual Harassment Complaint
35 complaints per year
Actual costs from records (legal fees, settlements, losses, material, direct expenses) plus additional estimated costs from staff (EEO/AA staff time, management time): $852,000 annually
Cost per complaint = $852,000 / 35 = $24,343
272
Where to Find Experts The obvious department They send the report
It’s in the job title The directory Ask
273
What Makes an Expert Credible?
Experience Neutrality No conflict of interest Credentials Publications Track record
274
Converting Data Using External Database
Cost of one turnover Middle Manager $70,000 annual salary Cost of turnover 150% Total cost of turnover $105,000
275
Finding the Data Search engines Research databases Academic databases
Industry / trade databases Government databases Commercial databases Association databases Professional databases
276
Customer Satisfaction
A positive correlation links customer satisfaction to revenue.
277
Classic Relationships
Attitudinal measures (job satisfaction, organizational commitment, engagement, customer satisfaction, conflicts) show classic relationships with outcome measures (turnover, absenteeism, customer satisfaction, productivity, revenue).
278
Linkage with Other Measures
A compelling place to work, a compelling place to shop, a compelling place to invest
Employee measures (attitude about the job, attitude about the company, employee behavior, employee retention) drive customer measures (service helpfulness, merchandise value, customer impression, customer recommendations, customer retention), which drive investor measures (revenue growth, operating margin, return on assets).
A 5-unit increase in employee attitude drives a 1.3-unit increase in customer impression, which drives a 0.5 increase in revenue growth.
279
Estimating the Value Use the most credible source Check for biases
Discuss the value in general terms Provide information to assist in the estimates Collect data in a non-threatening way Adjust for the error
280
Turnover Cost Summary
Job Type / Category: Turnover Cost Range
Entry level - hourly, non-skilled: 30-50%
Service / production workers - hourly: 40-70%
Skilled hourly: %
Clerical / administrative: 50-80%
Professional: %
Technical: %
Engineers: %
Specialists: %
Supervisors / team leaders: %
Middle managers: %
281
Turnover Costs Summary
Exit cost of previous employee Recruiting cost Employment cost Orientation cost Training cost Wages and salaries while training Lost productivity Quality problems Customer dissatisfaction Loss of expertise/ knowledge Supervisor’s time for turnover Temporary replacement costs
282
Converting Data: Questions to Ask
What is the value of one additional unit of production or service? What is the value of a reduction of one unit of a quality measurement (rejects, waste, errors)? What are the direct cost savings? What is the value of one unit of time improvement? Are cost records available? Is there an internal expert who can estimate the value? and . . .
283
Converting Data: Questions to Ask
Is there an external expert who can estimate the value? Are there any government, industry, or research data available to estimate the value? Are supervisors of program participants capable of estimating the value? Is senior management willing to provide an estimate of the value? Does the staff have expertise to estimate the value? Notes: An HRD executive is quoted “the conversion of soft data to a monetary value creates an illusion of a precision that does not exist. As a result, we do not use soft data savings in any of our evaluation projects.” Do you agree? ____________________________________
284
Short-Term Solutions
Defined in terms of the time to complete or implement the program
Appropriate when this time is a month or less
Appropriate when the lag between Levels 3 and 4 is relatively short
Reflects most HR solutions
285
When Estimating Time for Long-Term Solutions
Secure input from all key stakeholders (sponsor, champion, implementer, designer, evaluator) Be conservative Have it reviewed by Finance & Accounting Use forecasting
286
Converting Your Level 4 Measures to Money
Isolation Technique(s) Data Conversion Technique(s)
287
Total Fitness Company Calculate the annual savings from the improvement. Is this a credible process?
288
Absenteeism linked to program
7% - 4% = 3%
3% x 40% = 1.2%
Absence days prevented: 240 days x 120 employees x 1.2% = 346 days
Monetary value: 346 days x $105/day = $36,330, or 346 days x $90/day = $31,140
289
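The absenteeism calculation above can be reproduced directly (figures from the slide; variable names are ours):

```python
# Total Fitness example: absence reduction attributed to the program.
baseline, post = 0.07, 0.04          # absenteeism rates before and after
attribution = 0.40                   # fraction isolated to the program
workdays, employees = 240, 120

reduction = (baseline - post) * attribution                # 1.2% of scheduled days
days_prevented = round(workdays * employees * reduction)   # ≈ 346 days

value_high = days_prevented * 105    # at $105/day → $36,330
value_low = days_prevented * 90      # at $90/day → $31,140
```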
Data Conversion Test
1. Is there a standard value? Yes: add to numerator. No: continue.
2. Is there a method to get there? No: move to intangible benefits. Yes: continue.
3. Can it be done with minimum resources? No: move to intangible benefits. Yes: continue.
290
4. Can you convince it's credible in 2 minutes? No: move to intangible benefits. Yes: convert the data and add to numerator.
291
Reasons for Developing Cost Data
To determine the overall expenditure To determine the relative cost To predict future program costs To calculate benefits versus costs To improve the efficiency To evaluate alternatives To plan and budget To develop a marginal cost pricing system To integrate data into other systems Notes: ____________________________________
292
Issues About Tracking Costs
Monitor costs, even if they are not needed for evaluation Costs will not be precise Use a practical approach Minimize the resources to track costs Estimates are acceptable Use caution when reporting costs Do not report costs of a program without reporting benefits (or at least have a plan)
293
How Much Should You Spend on HR?
Overall Expenditures Total Expenditures Total – Human Capital % of Payroll % of Revenues % of Operating Costs Expenditures per Employee
294
How Much Should You Spend on HR?
Functional Area Needs Assessment Development Delivery/Implementation Operation/Maintenance Evaluation
295
Questions for Discussion
Is there a significant difference between estimated and actual costs? Explain. How did you determine what your targets would be? What should you spend?
296
Overall Cost Categories
Analysis costs Development costs Delivery costs Operating / Maintenance costs Evaluation costs Notes: Which cost categories are appropriate for your organization? ____________________________________ ____________________________________
297
Tabulating Program Costs
Recommended Items Needs assessment (prorated) Development costs (prorated) Program materials Facilitator / coordinator costs Facilities costs Travel / Lodging / Meals Participants’ time (salaries and benefits) Administrative / Overhead costs Operations / Maintenance costs Evaluation costs Notes: Which cost categories are included in your calculations? ____________________________________
298
Prorating Cost Life cycle approach Initial cost plus annual updates
299
Example of Prorating Leadership 101 5-year life cycle
200 participants per year $75,000 initial development costs 2 groups of 25 are being evaluated at the ROI level How much development cost should be charged to the ROI project?
300
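The prorating example above works out as follows (figures from the slide; variable names are ours):

```python
# Prorating development cost over the program's life cycle, then charging
# only the evaluated participants' share to the ROI study.
dev_cost = 75_000
life_years, participants_per_year = 5, 200
evaluated = 2 * 25                      # two groups of 25

cost_per_participant = dev_cost / (life_years * participants_per_year)  # $75
charged = cost_per_participant * evaluated                              # $3,750
```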
Overhead Allocation Example
Portion of budget not allocated to specific projects $548,061 Total number of days dedicated to specific projects/programs 7,450 Per day overhead allocation $______ What is the total overhead allocation for the program that takes 3 days to complete?
301
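Filling in the blank above is a matter of one division (figures from the slide; variable names are ours):

```python
# Overhead allocation: unallocated budget spread over project days.
unallocated_budget = 548_061
project_days = 7_450

per_day = unallocated_budget / project_days   # ≈ $73.57 per day
three_day_program = per_day * 3               # ≈ $220.70 total for a 3-day program
```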
Cost Estimating Worksheet
Costs Classification Matrix Cost Estimating Worksheet
302
Federal Information Agency (A)
What types of data should be collected for application and implementation? What business impact measures should be collected? What is the time frame for data collection? Which cost categories should be utilized in capturing the actual cost of the program? Can the value of this program be forecasted? If so, how?
303
Federal Information Agency (B)
Please calculate the actual cost of the program for 100 participants. Assume a 5% dropout rate each year. Most of these costs are estimated or rounded off. Is this appropriate? Explain. What issues surface when developing cost data? How can they be addressed?
304
Different Approaches Cost Benefit Analysis Return on Investment
Payback Period Discounted Cash Flow Internal Rate of Return Utility Analysis Consequences of not providing learning systems Most Common Notes: ____________________________________
305
Defining the Benefit Cost Ratio
Benefit/Cost Ratio = Program Benefits / Program Costs
Example
Program Benefits = $71,760
Program Costs = $32,984
BCR = 2.1756
306
Defining the Return on Investment
ROI (%) = (Net Program Benefits / Program Costs) X 100
Example
Net Program Benefits = $38,776
Program Costs = $32,984
ROI = 117%
307
Defining the Payback Period
Payback Period = (Total Investment / Annual Savings) X 12
Example
Total Investment = $32,984
Annual Net Savings = $38,776
Payback Period = .85 X 12 = 10.2 months
308
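The three headline calculations from the preceding slides can be verified together (figures from the worked example; variable names are ours; the payback figure uses net annual savings of $38,776, which reproduces the slide's 10.2 months):

```python
# BCR, ROI, and payback period from one set of program figures.
benefits, costs = 71_760, 32_984
net = benefits - costs                      # $38,776 net program benefits

bcr = benefits / costs                      # ≈ 2.1756 : 1
roi_pct = net / costs * 100                 # ≈ 117%
payback_months = costs / net * 12           # ≈ 10.2 months
```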
ROI Target Options Set the value as with other investments, e.g. 15%
Set slightly above other investments, e.g. 25% Set at break even - 0% Set at client expectations
309
A Rational Approach to ROI
Keep the process simple Use sampling for ROI calculations Always account for the influence of other factors Involve management in the process Educate the management team Communicate results carefully Give credit to participants and managers Plan for ROI calculations Notes: Identify (2) HRD program examples where a cost/benefit calculation would be appropriate to use. 1.________________________________ _________________________________ 2.________________________________ Identify (2) HRD program examples where a cost/benefit calculation would not be appropriate to use. 1.________________________________
310
The Journey to Increased Accountability
Over time, the journey moves from normal accountability through Level 1 (Reaction), Level 2 (Learning), Level 3 (Application), Level 4 (Business Impact), and Level 5 (ROI) toward operating as a profit center.
311
Cautions When Using ROI
Take a conservative approach when developing both benefits and costs. Use caution when comparing the ROI in HR with other financial returns. Involve management in developing the methodology. Fully disclose the assumptions and methodology. and . . .
312
Cautions When Using ROI
Approach sensitive and controversial issues with caution. Teach others the methods for calculating the return. Recognize that not everyone will buy into ROI. Do not boast about a high return. Choose the place for the debates. Do not try to calculate the ROI on every program.
313
Improper Use of ROI ROI – return on information
ROI – return on intelligence ROI – return on involvement ROI – return on inspiration ROI – return on implementation ROI – return on initiative
314
ROI Myths ROI is too complex for most users.
ROI is too expensive, consuming too many critical resources. If senior management does not require ROI, there is no need to pursue it. ROI is a passing fad. ROI is too subjective. ROI is for post-analysis only.
315
The Potential Magnitude of an ROI: 1,500%+
When:
1. A need is identified
2. a performance gap exists or a new requirement is introduced
3. an effective solution is implemented at a reasonable cost, at the right time, for the right people
4. the solution is applied and supported in the work setting
5. and linkage exists to one or more business measures
316
Guiding Principles
1. When a higher level evaluation is conducted, data must be collected at lower levels.
2. When an evaluation is planned for a higher level, the previous level of evaluation does not have to be comprehensive.
3. When collecting and analyzing data, use only the most credible sources.
4. When analyzing data, choose the most conservative among alternatives.
5. At least one method must be used to isolate the effects of the program.
6. If no improvement data are available, it is assumed that little or no improvement has occurred.
317
Guiding Principles
7. Estimates of improvement should be adjusted for the potential error of the estimate.
8. Extreme data items and unsupported claims should not be used in ROI calculations.
9. Only the first year of benefits should be used in the ROI analysis of short-term projects.
10. Program costs should be fully loaded for ROI analysis.
11. Intangible measures are defined as measures that are purposely not converted to monetary value.
12. The results from the ROI Methodology must be communicated to all key stakeholders.
318
Typical Intangible Measures Linked with Programs
Job satisfaction Organizational commitment Climate Engagement Employee complaints Recruiting image Brand awareness Stress Leadership effectiveness Resilience Caring Career minded Customer satisfaction Customer complaints Customer response time Teamwork Cooperation Conflict Decisiveness Communication
319
Identification of Intangible Measures: Timing and Source
1. Needs Assessment
2. ROI Analysis Planning
3. Data Collection
4. Data Analysis
320
Issues with Intangibles
May be the most important data set Are not converted to money by definition Are usually not subjected to “isolating” Must be systematically addressed Must be reported “credibly”
321
Reporting Intangibles
Usually presented as a table Must indicate how the data were collected Use rules to decide if a measure should be listed Be prepared for further analysis
322
Communication Challenges
Measurement and evaluation are meaningless without communication Communication is necessary for making improvements Communication is a sensitive issue Different audiences need different information
323
Communication Principles
Keep communication timely Target communication to specific audiences Stay unbiased and modest with the message Carefully select communication media Keep communication consistent with past practices Incorporate testimonials from influential individuals Consider your function’s reputation when developing the overall strategy Use language your audience understands
324
Audience Selection Questions
Are they interested in the program? Do they really want to receive the information? Has someone already made a commitment to them regarding communication? Is the timing right for this audience? Are they familiar with the program? How do they prefer to have results communicated? Are they likely to find the results threatening? Which medium will be most convincing to them?
325
Common Target Audiences
Reason for Communication: Primary Target Audience
Secure approval for the program: client, top executives
Gain support for the program: immediate managers, team leaders
Build credibility for the staff: top executives
Enhance reinforcement of the program: immediate managers
Enhance results of future programs: participants
Show complete results of the program: key client team
Stimulate interest in programs: top executives
Demonstrate accountability for client expenditures: all employees
Market future programs: prospective clients
326
Complete Report General information Methodology for impact study
Data analysis Costs Results Barriers and enablers Summary of findings Conclusions and recommendations Exhibits
327
The Impact Study Serves Several Purposes:
The method of communicating results, but only for those audiences needing detailed information. A reminder of the resources required to produce major studies. A historical document of the methodology, instruments, and processes used throughout the impact study. A teaching and discussion tool for staff development.
328
Select Media Impact Studies Meetings Internal Publications
Full report Executive summary General overview One-page summary Meetings Executive meetings Manager meetings Staff meetings Panel discussions Best practice meetings Internal Publications Announcements Bulletins Newsletters Magazines Progress Reports Schedules Preliminary results Memos and
329
Select Media Case Studies Program Brochures Scoreboards
Electronic Media Web sites Video blogs
330
Impact Study Outline
General Information: objectives of study; background
Methodology for Impact Study (builds credibility for the process): levels of evaluation; ROI process; collecting data; isolating the effects of the program; converting data to monetary values; costs; assumptions (guiding principles)
331
Impact Study Outline
Results (the results with six measures: Levels 1-5 and intangibles): general information; response profile; participant reaction; learning; application of skills/knowledge; barriers; enablers; business impact; general comments; linkage with business measures; costs; ROI calculation; intangible benefits
332
Impact Study Outline Summary of Findings
Conclusions and Recommendations Conclusions Recommendations Exhibits
333
Communicating with Senior Management
Can they take it? Do they believe you?
334
Purpose of the Meeting Create awareness and understanding of ROI
Build support for the ROI methodology Communicate results of study Drive improvement from results Cultivate effective use of the ROI methodology
335
Meeting Ground Rules Do not distribute the impact study until the end of the meeting Be precise and to the point Avoid jargon and HR speak Spend less time on the lower levels of evaluation data Present the data with a strategy in mind
336
Presentation Sequence
Describe the program and explain why it is being evaluated Present the methodology process Present the reaction and learning data Present the application data List the barriers and enablers to success Address the business impact
337
Presentation Sequence
Show the costs Present the ROI Show the intangibles Review the credibility of the data Summarize the conclusions Present the recommendations
338
Communication Progression
First 2 ROI studies: detailed study, presented in a meeting
3-5 ROI studies: executive summary
6-plus ROI studies: one-page summary, no meeting
339
ROI Impact Study: One-Page Summary
Program Title: Preventing Sexual Harassment at Healthcare, Inc. Target Audience: First and Second Level Managers (655) Secondary: All employees through group meetings (6,844) Duration: 1 day, 17 sessions
340
Brief Reports Executive Summary Slide Overview
1-page Summary (see example) Brochure
341
Electronic Reporting Website blogs Video
342
Mass Publications Announcements Bulletins Newsletters Magazines
343
Case Study Internal Use
Communicate results Teach others Build a history Serve as a template Make an impression
344
Case Study External Publication
Provide recognition to participants Improve image of function Enhance brand of department Enhance image of organization
345
Micro Level Scorecard and Macro Level Scorecard
0 Indicators, 1 Reaction, 2 Learning, 3 Application, 4 Impact, 5 ROI, Intangibles
346
Building a Macro Scorecard
Provides macro-level perspective of success Serves as a brief report versus detailed study Shows connection to business objectives Integrates various types of data Demonstrates alignment between programs, strategic objectives, and operating goals
347
Seven Categories of Data
Indicators Reaction and Planned Action Learning Application Business Impact ROI Intangibles
348
Potential Reporting 0. Indicators Number of Employees Involved
Total Hours of Involvement Hours Per Employee Training investment as a Percent of Payroll Cost Per Participant
349
Potential Reporting I. Reaction and Planned Action
Percent of Programs Evaluated at this Level Ratings on 7 Items vs. Target Percent with Action Plans Percent with ROI Forecast
350
Potential Reporting II. Learning
Percent of Programs Evaluated at This Level Types of Measurements Self Assessment Ratings on 3 Items vs. Targets Pre/Post – Average Differences
351
Potential Reporting III. Application
Percent of Programs Evaluated at This Level Ratings on 3 Items vs. Targets Percent of Action Plans Complete Barriers (List of Top Ten) Enablers (List of Top Ten) Management Support Profile
352
Potential Reporting IV. Business Impact
Percentage of Programs Evaluated at This Level Linkage with Measures (List of Top Ten) Types of Measurement Techniques Types of Methods to Isolate the Effects of Programs Investment Perception
353
Potential Reporting V. ROI Percent of Programs Evaluated at This Level
ROI Summary for Each Study Methods of Converting Data to Monetary Values Fully Loaded Cost Per Participant
354
Potential Reporting Intangibles List of Intangibles (Top Ten)
How Intangibles Were Captured
355
Use of Evaluation Data
Each use draws on the appropriate level of data (1-5):
Adjust program design
Improve program delivery
Influence application and impact
Enhance reinforcement
Improve management support
Improve stakeholder satisfaction
Recognize and reward participants
Justify or enhance budget
Develop norms and standards
Reduce costs
Market programs
Expand implementation to other areas
356
Delivering Bad News Never fail to recognize the power to learn and improve with a negative study. Look for red flags along the way. Lower outcome expectations with key stakeholders along the way. Look for data everywhere. Never alter the standards. Remain objective throughout the process. and . . .
357
Delivering Bad News Prepare the team for the bad news.
Consider different scenarios. Find out what went wrong. Adjust the story line to “Now we have data that shows how to make this program more successful.” In an odd sort of way, this becomes a positive spin on less-than-positive data. Drive improvement.
358
Analyze the Results of Communication
Observe reactions Solicit informal feedback Collect formal feedback Monitor blogs Make adjustments
359
ROI Possibilities Pre-Program ROI forecast
End-of-Program ROI forecast with Level 1 Data End-of-Program ROI forecast with Level 2 Data Follow-Up ROI forecast with Level 3 Data Follow-Up ROI evaluation with Level 4 Data
360
ROI at Different Levels
ROI with: Data Collection Timing (Relative to the Initiative): Credibility: Accuracy: Cost to Develop: Difficulty
Pre-program forecast: before: least credible: least accurate: least expensive: least difficult
Level 1 data: during
Level 2 data: during
Level 3 data: after
Level 4 data: after: most credible: most accurate: most expensive: most difficult
361
Pre-Program Forecast ROI Model
ESTIMATE CHANGE IN DATA ISOLATE THE EFFECTS OF THE PROGRAM CONVERT DATA TO MONETARY VALUES TABULATE PROGRAM COSTS CALCULATE THE RETURN ON INVESTMENT IDENTIFY INTANGIBLE BENEFITS
362
Retail Merchandise Company Questions for Discussion
Is a pre-program forecast possible? Which groups should provide input to the forecast?
363
“Expert” Input for Estimate
Source: Sales Increase Estimate (Δ): Forecasted ROI
Sales Associates: 0%: -100%
Dept. Managers: 5%: -30%
Store Managers: 10%: 33%
Sr. Executive: 15%: 110%
Analyst: 12%: 95%
Vendor: 25%: 350%
Marketing Analyst: 4%: -40%
Finance Staff: 2%: -80%
Benchmarking Data: 9%: 22%
364
Retail Merchandise Company Questions for Discussion
Assess the credibility of each “expert” group. Is there any additional information you need? How would you present this to senior management to make a decision to implement the program?
365
Steps for Pre-Program ROI Forecast
Develop Level 3 and 4 objectives, with as many specifics as possible Estimate/Forecast monthly improvement in Level 4 data (ΔP) Convert Level 4 measure to monetary value (V) Develop the estimated annual impact for each measure (ΔPxVx12) Estimate fully-loaded program costs and . . .
366
Steps for Pre-Program ROI Forecast
Calculate the forecasted ROI using the total projected benefits Use sensitivity analysis to develop several potential ROI values with different levels of improvement (ΔP) Identify potential intangible benefits Communicate analysis with caution
367
Steps to Pre-Program Forecast
Measure: Sales; Profit Margin: 2%
Source: Monthly Change: Value: Annual Change: Cost: ROI
SME: $25,000: $500: $6,000: $5,000: 20%
Vendor: $50,000: $1,000: $12,000: $5,000: 140%
Participant: $30,000: $600: $7,200: $5,000: 44%
Supervisor: $28,000: $560: $6,720: $5,000: 34%
368
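The forecast rows above follow one formula, which can be checked in a few lines (figures from the slide; the function name is ours):

```python
# Pre-program forecast: monthly sales change x 2% margin, annualized,
# against a $5,000 program cost.
MARGIN, COST = 0.02, 5_000

def forecast_roi(monthly_sales_change):
    annual_benefit = monthly_sales_change * MARGIN * 12
    return round((annual_benefit - COST) / COST * 100)

# forecast_roi(25_000) → 20  (SME)
# forecast_roi(50_000) → 140 (vendor)
# forecast_roi(30_000) → 44  (participant)
# forecast_roi(28_000) → 34  (supervisor)
```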
Sensitivity Analysis
Potential sales increase (existing customers): $25,000, $30,000, $50,000
Potential complaint reduction (monthly reduction): 10, 20, 30
Expected ROI: 60%, 90%, 120%, 150%, 180%
369
Input to Forecast Previous experience with same or similar programs
Supplier/Designer experience in other situations Estimates from supplier/designer Estimates from SMEs Estimates from client/sponsor Estimates from target participants
370
Forecasting ROI from a Pilot Program
Develop Level 3 and 4 objectives Design/Develop pilot program without the bells and whistles (or use a supplier program) Conduct the program with one or more “typical” groups Develop the ROI using the ROI Process model for Level 4 post-program data Make decision to implement based on results
371
Level 1 Measures Program content Materials Facilitator / coordinator
Relevance / importance Perceived value Amount of new information Recommendation to others Planned improvements Opportunity for forecast
372
Important Questions to Ask on Feedback Questionnaires
Planned Improvements Please indicate what you will do differently on the job as a result of this program 1.________________________________________________________ 2.________________________________________________________ 3.________________________________________________________ As a result of any change in your thinking, new ideas, or planned actions, please estimate (in monetary values) the benefit to your organization (i.e., reduced absenteeism, reduced employee complaints, better teamwork, increased personal effectiveness, etc.) over a period of one year __________________ What is the basis of this estimate?_______________________________________ What confidence, expressed as a percentage, can you put in your estimate? (0%=No Confidence; 100%=Certainty) ____________________%
373
ROI with Level 1 Data At the end of the program, ask participants:
What knowledge or skills have been improved? What actions are planned with the improved knowledge and skills? Which measures will be influenced? What impact, in monetary units, will this improvement have in the work unit? What is the basis for this estimate? What level of confidence do you place on this estimate? Then, compare total “adjusted” benefits with program costs.
374
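The adjustment step above can be sketched as follows. All estimates and the outlier threshold here are hypothetical, chosen only to illustrate the mechanics of confidence-weighting and Guiding Principle 8:

```python
# Level 1 forecast adjustment: discount each participant's monetary estimate
# by that participant's own confidence, and drop extreme claims before totaling.
claims = [
    (20_000, 0.90),   # (monetary estimate, confidence) - hypothetical data
    (9_000, 0.80),
    (10_000, 0.60),
    (500_000, 0.95),  # extreme outlier - excluded (Guiding Principle 8)
]
CAP = 100_000  # hypothetical threshold for discarding extreme claims

adjusted_total = sum(est * conf for est, conf in claims if est <= CAP)
# 20,000*0.9 + 9,000*0.8 + 10,000*0.6 = $31,200 in adjusted benefits
```

The adjusted total, not the raw sum, is then compared with program costs.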
Sales Increase Estimate
Participant No.: Sales Increase Estimate: Basis: Confidence Level
1: $20,000: Sales: 90%
2: $9,000: 2 sales per day: 80%
3: $50,000: Sales increase: 70%
4: $10,000: 3 sales daily: 60%
5: Millions: 4 sales each day: 95%
6: $75,000: More sales: 100%
7: $7,500: 3 more sales: (blank)
8: $25,000: 4 sales – 1 sale: 75%
9: $15,000: One more sale: 30%
10: (blank): 2 new sales: (blank)
11: $45,000: (blank): (blank)
12: $40,000: 2 sales each day: (blank)
13: (blank): No increase: (blank)
14: $150,000: Many new sales: (blank)
15: Unlimited: Additional sales: 50%
16: $37,000: More sales and satisfaction: (blank)
375
Retail Merchandise Company Questions for Discussion
What is your strategy for analyzing this data? How reliable is this data? How could you use this data?
376
Level 2 Evaluation Tests Opportunity for forecast Skill practices
Self reports Exercises Observations during the training program Checklists by facilitator Team assessments
377
ROI with Level 2 Data Develop an end-of-program test that reflects program content Establish a relationship between test data and output performance for participants Predict performance levels of each participant with given test scores Convert performance data to monetary value Compare total predicted value of program with program costs
378
Relationship Between Test Scores and Performance
379
Relationship Between Test Scores and Sales Performance
380
Retail Merchandise Company Questions for Discussion
Calculate the forecasted ROI. How reliable is this estimate of ROI at Level 2? What other issues might need to be considered in this process? Is this information useful? If so, how should the information be used?
381
Projected Benefit: $9,698 x .14 x .02 x 48 = $1,303
BCR = $1,303 / $687 = 1.9
ROI = 90%
382
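The projected-benefit arithmetic above chains the slide's four factors ($9,698, .14, .02, 48) against the $687 cost; it can be confirmed directly (variable names are ours):

```python
# Level 2 forecast: chain the slide's four factors into a projected benefit,
# then compute BCR and ROI against the program cost.
benefit = 9_698 * 0.14 * 0.02 * 48        # ≈ $1,303 projected benefit
cost = 687

bcr = benefit / cost                       # ≈ 1.9
roi_pct = (benefit - cost) / cost * 100    # ≈ 90%
```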
Retail Merchandise Company Questions for Discussion
What is the ROI for this program? How credible is this approach to calculating ROI? Could this same approach be used to forecast the value prior to the implementation of the program?
383
ROI Calculation
BCR = Benefits / Costs =
ROI = (Net Benefits / Costs) X 100 =
384
ROI Calculation
BCR = $3,242 / $687 = 4.72
ROI = ($3,242 - $687) / $687 X 100 = 372%
385
ROI with Level 3 Data Develop competencies for the target job.
Indicate percentage of job success that is covered in the program. Determine monetary value of competencies, using salaries and employee benefits. Compute the worth of pre- and post-program skill levels. Subtract post-program values from pre-program values. Compare the total added benefits with the program costs.
386
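The competency-based steps above can be sketched numerically. Every figure here is hypothetical, chosen only to illustrate the arithmetic of the approach:

```python
# Level 3 ROI sketch: value the competencies covered by the program using
# compensation, then credit the pre-to-post change in skill level.
salary_and_benefits = 60_000        # hypothetical annual compensation
job_coverage = 0.30                 # hypothetical share of job success covered
pre_skill, post_skill = 0.60, 0.85  # hypothetical competency ratings

value_of_competencies = salary_and_benefits * job_coverage        # $18,000
added_benefit = value_of_competencies * (post_skill - pre_skill)  # ≈ $4,500/participant
```

The total added benefit across participants is then compared with program costs.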
Advantages of Forecasting
Increases the usefulness of data collection Focuses attention on business outcomes Monitors the path to success Compares forecast to actual results to improve forecasts
387
Forecasting Realities
If you must forecast, forecast frequently Consider forecasting an essential part of the evaluation mix Forecast different types of data Secure input from those who know the process best Long-term forecasts will usually be inaccurate
388
Forecasting Realities
Expect forecasts to be biased Serious forecasting is hard work Review the success of forecasting routinely The assumptions are the most serious error in forecasting Utility is the most important characteristic of forecasting
389
Barriers to ROI Use and Implementation
After mastering the ROI model, it is appropriate to examine implementation in more detail. Please take a few moments to identify the barriers to implementation. List all the “things” that can prevent a successful implementation. Be candid.
390
Overcoming the Barriers
Now, identify the actions needed to minimize, remove, or go around the barriers. List all the “steps” that need to be taken to overcome the barrier.
391
Implementation Issues
Resources (staffing / budget) Leadership (individual, group, cross functional team) Timing (urgency, activities) Communication (various audiences) Commitment (staff, managers, top executives)
392
Typical Barriers
I don’t have time for additional measurement and evaluation. An unsuccessful evaluation will reflect poorly on my performance. A negative ROI will kill my program. My budget will not allow for additional measurement and evaluation. Measurement and evaluation are not part of my job. and . . .
393
Typical Barriers
I didn’t have input on this process.
I don’t understand this process. Our managers will not support this process. Data will be misused. The data are too subjective.
394
Building Blocks to Overcome Resistance
(from the foundation up)
Assessing the Climate for Measuring ROI
Developing Roles and Responsibilities
Establishing Goals and Plans
Revising Policies and Procedures
Preparing the Staff
Tapping into a Network
Initiating the ROI Projects
Preparing the Management Team
Removing Obstacles
Monitoring Progress
Utilizing Shortcuts
395
Assessing the Climate for Results
Survey staff (team members)
Survey staff from management perspective
Develop gaps (actual vs. desired)
Plan actions
396
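The gap-development step is a per-item subtraction of actual from desired ratings; a minimal sketch with hypothetical 5-point survey items:

```python
def climate_gaps(actual, desired):
    """Gap per survey item: desired rating minus actual rating.
    Positive gaps flag items that need an action plan."""
    return {item: desired[item] - actual[item] for item in desired}

# Hypothetical climate-survey results
actual  = {"executive support": 2, "evaluation skills": 3, "data access": 4}
desired = {"executive support": 5, "evaluation skills": 4, "data access": 4}
print(climate_gaps(actual, desired))
# {'executive support': 3, 'evaluation skills': 1, 'data access': 0}
```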
Identifying Champions
You cannot do it alone
Champions have a passion for accountability
Consider a champion from each area
Network the champions
Recognize the champions
397
Measurement and Evaluation Implementation Project Plan for a Large Petroleum Company
Team formed: Jan
Policy developed: Feb-Apr
Targets set: Jan-Feb
Workshops developed: Mar-Jul
Application Evaluation Project (A): Apr-Sept
Impact Evaluation Project (B): Jun-Jan
Impact Evaluation Project (C): Sept-Mar
and . . .
398
Measurement and Evaluation Implementation Project Plan for a Large Petroleum Company
ROI Project (D): Nov-Aug
Staff trained: Aug-Jan
Vendors trained: Feb-Apr
Managers trained: May-Aug
Support tools developed: Apr-May
Evaluation guidelines developed: Feb-Jun
399
Responsibilities for Champions
Designing data collection instruments
Providing assistance for developing an evaluation strategy
Analyzing data, including specialized statistical analyses
Interpreting results and making specific recommendations
and . . .
400
Responsibilities for Champions
Developing an evaluation report or case study to communicate overall results
Providing technical support in any phase of measurement and evaluation
Assisting in communicating results to key stakeholders
401
Responsibilities for Team Members
Ensure that the needs assessment includes specific business impact measures. Develop application objectives and business impact objectives for each program. Focus the content of the program on the objectives of business performance improvement, ensuring that exercises, case studies, and skill practices relate to the desired objectives. and . . .
402
Responsibilities for Team Members
Keep participants focused on application and impact. Communicate rationale and reasons for evaluation. Assist in follow-up activities to capture business impact data. Provide assistance for data collection, data analysis, and reporting. Design simple instruments and procedures for data collection and analysis. Present evaluation data to a variety of groups.
403
Getting Team Members Involved
Developing plans
Establishing responsibilities
Designing tools and templates
Selecting programs for higher level evaluation
Driving changes / improvements
404
Participant Responsibilities
Actively participate
Learn what’s needed
Apply and implement program
Secure results
Provide data
405
Conduct Several Studies
Cover a variety of areas
Move from simple to complex
Mix up Levels 3, 4, and 5
Avoid political issues early in the process
406
Conduct Workshops and Briefings
1 to 1½-hour briefings
1-day workshops
2-day workshops
Special topics
407
Creating an ROI Network
Within the organization
Within the local area
Within the community
408
Typical Network Issues
Communication methods
Membership rules
Meeting times
Topics / Issues
Monitoring / Managing
409
Typical Network Topics
Tool / Template sharing
Collaborative projects
Research / Benchmarking
Sounding board
Project critiques
Technology review
410
Key ROI Issues
Time
Cost
Complexity
Accuracy
Credibility
Lack of Skills
411
Cost-Saving Approaches to ROI
Plan for evaluation early in the process
Build evaluation into the process
Share the responsibilities for evaluation
Require participants to conduct major steps
Use short-cut methods for major steps
Use sampling to select the most appropriate programs for ROI analysis
and . . .
412
Cost-Saving Approaches to ROI
Use estimates in the collection and analysis of data
Develop internal capability to implement the ROI process
Utilize web-based software to reduce time
Streamline the reporting process
413
Tools and Templates
Instruments
Costs
Analysis
Reporting
414
Technology
Reaction / Learning surveys
Test design
Follow-up surveys
Statistics packages
ROI software
Scorecards
415
Suggested Evaluation Targets
Level / Target
Level 1 - Reaction: 100%
Level 2 - Learning: 60%
Level 3 - Application: 30%
Level 4 - Business Impact: 10-20%
Level 5 - ROI: 5-10%
416
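A quick way to check a portfolio against these targets is to count, for each level, the share of programs evaluated at that level or higher. A sketch, assuming each program reports the highest evaluation level it reached (the portfolio data are hypothetical, and the lower bound is used where the target is a range):

```python
# Suggested targets from the slide (lower bound where a range is given)
TARGETS = {1: 100, 2: 60, 3: 30, 4: 10, 5: 5}

def coverage(highest_levels):
    """Percentage of programs evaluated at each level, given the highest
    evaluation level reached by each program. A program evaluated at
    level N counts toward levels 1 through N."""
    n = len(highest_levels)
    return {lvl: round(100 * sum(1 for h in highest_levels if h >= lvl) / n, 1)
            for lvl in TARGETS}

# Hypothetical portfolio of ten programs
cov = coverage([1, 1, 1, 1, 2, 2, 2, 3, 3, 5])
shortfalls = {lvl: TARGETS[lvl] - cov[lvl]
              for lvl in TARGETS if cov[lvl] < TARGETS[lvl]}
print(cov)         # {1: 100.0, 2: 60.0, 3: 30.0, 4: 10.0, 5: 10.0}
print(shortfalls)  # {} -- every target met
```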
Worksheet – Project/Program Selection Criteria
List each project/program that fits Level 3 criteria in the left column. Rank each project/program in its category as High Priority (HP), Special Attention (SA), or Business as Usual (BAU).
Compliance Project/Program
Customer Service Project/Program
Sales Program
Call Center or other Customer Transaction Program
Organization-Sponsored Certification Program
417
Level 3 Priority Ranking
High Priority: Project/program clearly must be evaluated at Level 3 in the short term.
Special Attention: May not be evaluated at Level 3 in the short term, but there are enough issues that an assignment will be made to assess the situation.
Business as Usual: Continue with the current strategy for this program.
418
Worksheet – Project/Program Selection Criteria
List each project/program you are considering evaluating in the left column. Rank each program as 1, 2, 3, 4, or 5 for each of the ten criteria.
Life Cycle of Project/Program
Operational Objectives
Strategic Objectives
Costs
Audience Size
Visibility
Investment of Time
Needs Assessment Conducted
Management Interest
Quality of Data Collection Processes
419
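The 1-to-5 ratings can be totaled per program to produce a selection ranking; a minimal sketch (the criterion keys and the unweighted sum are assumptions for illustration, not part of the worksheet):

```python
# The ten criteria from the worksheet, paraphrased for use as dict keys
CRITERIA = ["life_cycle", "operational_objectives", "strategic_objectives",
            "costs", "audience_size", "visibility", "time_investment",
            "needs_assessment", "management_interest", "data_quality"]

def total_score(ratings):
    """Sum a program's 1-to-5 ratings across all ten criteria."""
    assert set(ratings) == set(CRITERIA), "rate every criterion"
    return sum(ratings.values())

def rank_programs(programs):
    """Order candidate programs by total score, highest first."""
    return sorted(programs, key=lambda name: total_score(programs[name]),
                  reverse=True)

# Hypothetical candidates: one rated 4 on everything, one rated 3
a = dict.fromkeys(CRITERIA, 4)
b = dict.fromkeys(CRITERIA, 3)
print(rank_programs({"Program A": a, "Program B": b}))  # ['Program A', 'Program B']
```

A weighted sum is a natural extension if some criteria (for example, management interest) matter more in a given organization.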
Criteria for Selecting Programs for Levels 4 and 5 Evaluation
Life cycle of the solution
Linkage of solution to operational goals and issues
Importance of solution to strategic objectives
Top executives are interested in the evaluation
Cost of the solution
Visibility of the solution
Size of the target audience
Investment of time
420
Results-Based Policy Statement
Provides focus for the staff
Communicates results-based philosophy
Sets goals and targets for evaluation
Determines basic requirements
Serves as a learning tool
421
Results-Based Policy Key Elements
Purpose / Mission / Direction
Evaluation targets
Evaluation support group functions
Responsibility for results
Management review of results
Follow-up process
Communication of results
422
Evaluation Procedures and Guidelines
Show how to utilize tools and techniques
Guide the design process
Provide consistency in the process
Ensure that the appropriate methods are used
Keep the process on track
Place emphasis on the desired areas
423
Management Influence
Commitment usually refers to the top management group and includes its pledge or promise to allocate resources. Management support refers to the action of the entire management group and reflects the group’s attitude towards the HR process and staff. and . . .
424
Management Influence
Management involvement refers to the extent to which executives and managers are actively engaged in the HR process in addition to participating in the program. Management reinforcement refers to the actions designed to reward and encourage a desired behavior.
425
Why Managers Don’t Support Your Programs
No results
Too costly
No input
No relevance
No involvement
No time
No preparation
Lack of knowledge about HR
No requirements
426
Management Action / Target Group / Scope / Payoff
Commitment: Top executives; All programs; Very high
Support: Mid managers, 1st-level managers; Several programs; High
Reinforcement: 1st-level managers; Specific programs; Moderate
Involvement: All levels
427
The Results Commitment Relationship
Diagram: the relationship linking Top Management Commitment, Successful Programs, and Business Results.
428
CEO Commitment Checklist
429
Ten Commitments
Develop or approve a mission
Allocate the necessary funds
Allow employees time to participate
Become actively involved
Support the learning effort
430
Ten Commitments
Position the function
Require evaluation
Insist on cost effectiveness
Set an example
Create an atmosphere of open communication
431
Why Programs Don’t Work
Immediate manager does not support the program. The culture in the work group does not support the program. No opportunity to use the program. No time to implement the program. Didn’t learn anything that could be applied to the job. The systems and processes did not support the program.
432
Why Programs Don’t Work
Didn’t have the resources available to use the program. Changed job and the program no longer applies. This is not appropriate in our work unit. Didn’t see a need to use the program. Could not change old habits.
433
Questions for Discussion
When considering the situation, what specifically can be done to enhance the program success? How important is the role of the manager of participants in programs? What implication does this have for your programs?
434
The Transfer of Success to the Job
Matrix: ROLE-PLAYERS (Manager, Participant, Facilitator/Organizer) mapped against the TIMEFRAME (Before, During, and After the program).
435
Ideal Management Support
Gives endorsement and approval for participants to be involved in program. Volunteers personal services or resources to assist in the program’s implementation. Makes a pre-program commitment with the participant concerning expected efforts. Reinforces the behavior change resulting from the program. Conducts a follow-up on program results. Gives positive rewards for participants who experience success with the program.
436
Ideal Reinforcement
Helping the participant diagnose problems to determine if the program is needed
Discussing possible alternatives to help the participant apply the skills and implement the program
Encouraging the participant to implement the program
Serving as a role model for the proper use of the skills
Providing positive rewards to the participant when the program is successfully implemented
437
Levels of Management Support
Supportive: Strongly and actively supports all of our efforts.
Responsive: Supports programs, but not as strongly as the supportive manager.
Non-Supportive: Privately voices displeasure with our programs on an informal basis.
Destructive: Works actively to keep participants from being involved in our programs.
438
ROI: Tools vs. Relationships
Program Developers
Program Coordinators
Program Facilitators
Program Advisors
Program Managers
Participants
Supervisors
Managers
439
Types of Management Involvement
As members of advisory committees
As members of task forces
As subject matter experts
As participants
As program leaders
As evaluators
As program sponsors
As purchasers of services
In a newly-defined role
In rotational assignments
440
Potential Manager Involvement
Steps in the Process / Opportunity / Strategy
Conduct Analysis: High; Taskforce
Develop Measurement and Evaluation System: Moderate; Advisory committee
Establish Program Objectives
Develop Program
Implement Program: Program leader
441
Potential Manager Involvement
Steps in the Process / Opportunity / Strategy
Monitor Costs: Low; Expert input
Collect and Analyze Data: Moderate
Interpret Data and Draw Conclusions: High
Communicate Results: Manager as participant
442
Concerns About HR From a Key Manager
Results are not there
This is not my responsibility
I don’t have time for HR
I don’t understand what you do
No respect for HR
443
Managers Workshop Objectives
After completing this workshop, each manager should: See the results of HR. Understand his or her responsibility for HR. Identify areas for personal involvement in the HR process. Develop specific behaviors to support and reinforce program objectives. Realize the importance of the HR function in achieving departmental, division, and company goals.
444
Steps to Develop a Partnership
Assess the current status of partnership relationships. Identify key individuals for a partnership relationship. Learn the business. Consider a written plan. Offer assistance to solve problems. Show results of programs. Publicize partners’ accomplishments and successes.
445
Steps to Develop a Partnership
Ask the partner to review needs. Have partner serve on an advisory committee. Shift responsibility to the partner. Invite input from the partner about key plans and programs. Ask the partner to review program objectives, content, and delivery mechanisms. Invite the partner to conduct or coordinate a program or portion of a program. Review progress and re-plan strategy.
446
Partnering Principles
Have patience and persistence throughout the process. Follow win-win opportunities for both parties. Deal with problems and conflicts quickly. Share information regularly and purposefully. Always be honest and display the utmost integrity in all the transactions.
447
Partnering Principles
Keep high standards of professionalism in each interaction. Give credit and recognition to the partner routinely. Take every opportunity to explain, inform, and educate. Involve managers in as many activities as possible.
448
Annual HR Review Agenda
Review of previous year’s HR programs
Methods / levels of evaluation
Results achieved from programs
Significant deviations from expected results
Basis for determining HR needs for next year
Scheduled programs
Proposed methods / levels of evaluation
Potential payoffs
Problem areas in the HR process
Concerns from top management
449
Action Plan for Improvement
Develop a plan of implementation for improving measurement and evaluation in your organization. Consider all of the items included in this and other modules. Identify a particular time frame and key responsibilities.
450
Issue / Actions / Time / Responsibility
Perception of HR
Needs assessment/analysis
Objectives
Reaction measures
Learning measures
Application measures
Impact measures
ROI measures
Use of technology
Communicating results
Management influence
Staff development
Roles / responsibilities
451
International Car Rental