
1

2 Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com Associate Professor, Indiana University http://php.indiana.edu/~cjbonk, cjbonk@indiana.edu Dr. Vanessa Paz Dennen Assistant Professor, San Diego State University vdennen@mail.sdsu.edu http://edweb.sdsu.edu/people/vdennen

3 Workshop Overview Part I: The State of Online Learning Part II: Evaluation Purposes, Approaches, and Frameworks Part III: Applying Kirkpatrick's 4 Levels Part IV: ROI and Online Learning Part V: Collecting Evaluation Data & Online Evaluation Tools (Time: 8:30-11:30; 12:30-3:30)

4 Part I. The State of Online Learning Survey of Corporate Settings What’s Going On? And How Are We Evaluating It?

5 Free Corporate Reports 1. Corporate E-Learning: Exploring a New Frontier, Hambrecht and Co. (2000, March) http://www.wrhambrecht.com/research/coverage/elearning/ir/ir_explore.pdf (95 pages) 2. Training Magazine Special Issue, September 2000, 37(9), The State of Online Learning 3. Fortune Special Issue, 142(13), Nov. 27, 2000, Special Insert: E-learning strategies for executive education and corporate training. http://www.fortuneelearning.com/topics/

6 Survey of 201 Trainers, Instructors, Managers, Instructional Designers, CEOs, CLOs, etc.

7 Among the Key Goals 1. To identify the resources, tools, and activities desired in e-learning. 2. To document gaps between tools and resources deemed useful and actual use. 3. To survey commitment to e-learning. 4. To document practices related to e-learning training and support. 5. To document pedagogical practices and motivational techniques supported in e-learning.

8 Survey Limitations Sample pool—e-PostDirect The Web is changing rapidly Lengthy survey, low response rate No password or keycode Many backgrounds—hard to generalize Does not address all issues (e.g., ROI calculations, how trained & supported, specific assessments)

9

10

11

12

13 Primary Job Function 84% = Training (e.g., trainers, training managers, training directors, or training evaluators) –30% Instructors or Trainers –27% Training Managers –20% Training Evaluators –14% Training Directors 45% = Instructional Designers & Program Devel. 5% = Human Resources; 5% Performance Managers; and 4% CLOs

14 Categorized Job Titles 26% Trainers, Educators, or Instructors 20% Managers (e.g., Training, IT Programs, Instructional Designers, or Quality Assurance) 19% Directors (Director of Corp Education, E-Learning, Professional Development, etc.) 13% Instructional Designers or Technologists 13% High Ranking Administrators (CEO, President, CLO, CTO) 9% Consultants

15 Professional Reading Interests 80% read magazines or journals related to e-learning. Nearly 100% read training-related publications.

16

17

18

19

20 Why Interested in E-Learning? –Mainly cost savings –Reduced travel time –Greater flexibility in delivery –Timeliness of training –Better allocation of resources, speed of delivery, convenience, course customization, lifelong learning options, personal growth, greater distribution of materials

21 Why Interested in E-Learning? –"Exploit the technology to deliver our intellectual capital." –"Reduce time to learn, reduce time to productivity." –"Cost reduction (write once, publish on different platforms)." –"Invest less in expensive trips to train for 3 days without apparent results."

22

23 Blended Approach Is Most Common Ganzel, May 2001, Online Learning Magazine

24 Corporate Web Integration Continuum Level 1: Blended course—self-paced Level 2: Entire course online--self-paced Level 3: Tutored or mentored course Level 4: Blended course—instructor led Level 5: Entire course online-synchronous Level 6: Entire course online-asynchronous Level 7: Entire course online-sync and asynchronous Level 8: Certificate program online Level 9: Degree online Level 10: Corporate university online

25

26

27 Current Courseware System Negatives –"Slow development time." –"Not interactive." –"Low interactivity, boring." –"…lack of bookmarking, tracking, evaluation, etc." –"Don't support the instructional design process—are course management systems." –"XYZ,…, presents obstacles in moving course content from one server to another."

28 Current Courseware System: Negative and Positive –"…does provide a number of excellent features, yet development time is very clumsy…it is not very intuitive." –"XYZ is powerful and intuitive. It is not always reliable." –"Fairly reliable, but not always. At times have had to stop training and go back to the beginning to start again as it seizes up." –"From a cost posture, they are, quite simply, unbeatable. Limitations: Can't save whiteboard presentations developed in virtual classroom."

29 Current Courseware System: Positives –"It is comprehensive, scalable, and intuitive." –"…seems to be flexible." –"XYZ is simple to use & clean in design." –"modify to suit individual course needs." –"It's reasonably inexpensive, there is a Web-based template to design customized courses…easily added to existing courseware."

30 Delivery System –17% developed own systems or tools –15% did not know what system they were using –30% used Internet application tools (e.g., Designer's Edge, Dreamweaver, Authorware) –35% used presentation tools (e.g., Astound, WebEx) –Many used existing courseware systems and tools (e.g., WebBoard, Learning Space)

31 What Vendors Select & Why? Standardization vs. Innovation Standard Tool Advantages: Training easier, jump started, common framework, fixed costs Disadvantages: Tools do not fit all needs, need technical training, lose control

32 Web-Based Content –Capella –Click 2 Learn –Colleges/Universities –Digital Think –Docent, Inc. –Eduprise –Element K –eMind.com –eSocrates –ExecuTrain –Freeskills.com –Headlight.com –Jones International University –KnowledgeNet –Knowledge Planet –Mentergy--includes LearnLinc products –Microsoft Training and Service –Netg –Prime Learning –Saba –Smart Force –ThinQ (i.e., Trainingnet) –TrainSeek –Vcampus –Viviance New Education –Walden Univ./Institute

33

34

35 Why Evaluate? Cost-savings –Becoming less important reason to evaluate as more people recognize that the initial expense is balanced by long-term financial benefits Performance improvement –A clear place to see impact of online learning Competency advancement

36 Pause: How are costs calculated in online programs?

37 The Cost of E-learning Brandon-hall.com estimates that an LMS for 8,000 learners costs $550,000 This price doesn't include the cost of buying or developing content Bottom line: getting started in e-learning isn't cheap
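To make the "isn't cheap" point concrete, here is a minimal back-of-the-envelope sketch in Python. Only the $550,000 LMS estimate and the 8,000-learner figure come from the slide; the content-development and support numbers are hypothetical placeholders, not survey data.

```python
# Back-of-the-envelope e-learning startup cost sketch.
# LMS_COST and LEARNERS come from the brandon-hall.com estimate on the slide;
# CONTENT_DEV and SUPPORT_PER_YEAR are hypothetical placeholders.

LMS_COST = 550_000          # LMS for 8,000 learners (slide figure)
LEARNERS = 8_000
CONTENT_DEV = 250_000       # hypothetical: buying or developing initial content
SUPPORT_PER_YEAR = 75_000   # hypothetical: hosting, help desk, administration

def first_year_cost_per_learner():
    """First-year cost per learner, including the placeholder line items."""
    total = LMS_COST + CONTENT_DEV + SUPPORT_PER_YEAR
    return total / LEARNERS

if __name__ == "__main__":
    print(f"LMS alone per learner: ${LMS_COST / LEARNERS:.2f}")          # 68.75
    print(f"First-year total per learner: ${first_year_cost_per_learner():.2f}")
```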

38 Evaluation Process Can be likened to ADDIE instructional design model –ANALYSIS is needed to determine a purpose of the evaluation –A DESIGN is needed to guide the process –Instruments must be DEVELOPED –Without IMPLEMENTATION you have no data –In the end, the data are analyzed, and EVALUATED

39 A Few Assessment Comments

40 Level 1 Comments: Reactions "We assess our courses based on participation levels and online surveys after course completion. All of our courses are asynchronous." "I conduct a post course survey of course material, delivery methods and mode, and instructor effectiveness. I look for suggestions and modify each course based on the results of the survey." "We use the Halo Survey process of asking them when the course is concluding."

41 Level 2 Comments: Learning “We use online testing and simulation frequently for testing student knowledge.” “Do multiple choice exams after each section of the course.” “We use online exams and use level 2 evaluation forms.”

42 Level 3 Comment: Job Performance “I feel strongly there is a need to measure the success of any training in terms of the implementation of the new behaviors on the job. Having said that, I find there is very limited by our clients in spending the dollars required…”

43 More Assessment Comments Multiple Level Evaluation "Using Level One Evaluations for each session followed by a summary evaluation. Thirty days post-training, conversations occur with learners' managers to assess Level 2" (actually Level 3). "We do Level 1 measurements to gauge student reactions to online training using an online evaluation form. We do Level 2 measurements to determine whether or not learning has occurred…" "Currently, we are using online teaching and following up with manager assessments that the instructional material is being put to use on the job."

44 Who is Evaluating Online Learning? 59% of respondents said they did not have a formal evaluation program At Reaction level: 79% At Learning level: 61% At Behavior/Job Performance level: 47% At Results or Return on Investment: 30%

45

46 Assessment Lacking or Too Early “We are just beginning to use Web-based technology for education of both associates and customers, and do not have the metric to measure our success. However, we are putting together a focus group to determine what to measure (and) how.” “We have no online evaluation for students at this time.” “We lack useful tools in this area.”

47 Limitations with Current System “I feel strongly there is a need to measure the success of any training in terms of the implementation of the new behaviors on the job. Having said that, I find there is very limited by our clients in spending the dollars required…” “We are looking for better ways to track learner progress, learner satisfaction, and retention of material.” “Have had fairly poor ratings on reliability, customer support, and interactivity…”

48 Pause…How and What Do You Evaluate…?

49 What else did the corporate training survey show?

50

51

52

53

54

55

56

57

58 Sample Reasons for Obstacles “Skepticism on the benefits within the Healthcare environment.” “Ignorance about the advantages of using the Internet to save money.” “Generation gap and bias against anything not face to face.” “Poor support from IT managers to support organizational goals.” “Lack of foresight in the industry/no ability to see the big pic!”

59

60 Just Why is Bandwidth So Darn Important???

61

62 Obstacles: Technology Comments “Lack of hardware to efficiently use Web-based technology.” “Systems infrastructure.” “Huge diversity in hardware.” “Reliable Web access of our training audiences.” “Caught up in the tech not the instruction!”

63

64

65 Obstacles: Problems in Delivery Methods “Students needs hands on.” “High rate of change in IT materials—never mature.” “Effectiveness of this method.” “Some courses are better delivered in traditional classrooms.”

66

67

68

69

70

71

72

73

74

75

76

77 Issues Raised in Survey Increases in Web instruction anticipated Better tools needed Perceived high cost Need clearer vision & management support Lots of money being spent Low course completion rates Limited organizational support

78 So, any questions about the state of things?

79 What do we need??? Part II Evaluation Purposes, Approaches and Frameworks

80 One Area in Need of Frameworks is Evaluation of Online Learning

81 What is Evaluation??? "Simply put, an evaluation is concerned with judging the worth of a program and is essentially conducted to aid in the making of decisions by stakeholders." (e.g., does it work as effectively as the standard instructional approach). (Champagne & Wisher, in press)

82 But who are the evaluators? The level of evaluation will depend on articulation of the stakeholders. Stakeholders of evaluation in corporate settings may range from…???

83 What is assessment? Assessment refers to "…efforts to obtain information about how and what students are learning in order to improve…teaching efforts and/or to demonstrate to others the degree to which students have accomplished the learning goals for a course" (Millar, 2001, p. 11). It is a way of using information obtained through various types of measurement to determine a learner's performance or skill on some task or situation (Rosenkrans, 2000).

84 Why Evaluate?

85 Evaluation Purposes Assessing learner progress –What did they learn? Assessing learning impact –How well do learners use what they learned? –How much do learners use what they learn?

86 Evaluation Purposes Efficiency –Was online learning more effective than another medium? –Was online learning more cost-effective than another medium/what was the return on investment (ROI)? Improvement –How do we do this better?

87 Evaluation Purposes An evaluation plan can evaluate the delivery of e-learning, identify ways to improve the online delivery of it, and justify the investment in the online training package, program, or initiative (Champagne & Wisher, in press).

88 Evaluation Purposes Evaluation can help quantify the return on investment allowing one to compare the costs of acquiring, developing, and implementing e-learning to actual savings, revenue impact, and other competitive advantages that are translatable into monetary values.

89 Contextual Factors Learner progress, impact of training and efficiency all may be affected by other contextual factors Contextual factors unique to online learning: –Technology breakdowns –Inadequate computer systems (learners can’t access multimedia components -- and don’t know that they’re missing anything)

90 Evaluation Plans Does your company have a training evaluation plan?

91 Formal Evaluation Programs Most training evaluation data are not used for evaluation or performance improvement purposes. Why? There is no plan for using the data and no one has the time. Why does it matter in online learning? Need to be sure that the development expense is justified.

92 Steps to Developing an OL Evaluation Program Select a purpose and framework Develop benchmarks Develop online survey instruments –For learner reactions –For learner post-training performance –For manager post-training reactions Develop data analysis and management plan

93 What Are Your Evaluation Questions? What does your employer want to know about online learning’s impact? How interested is your employer in evaluation results?

94 Formative Evaluation Formative evaluations focus on improving the online learning experience. A formative focus will try to find out what worked or did not work. Formative evaluation is particularly useful for examining instructional design and instructor performance.

95 Formative Questions -How can we improve our OL program? -How can we make our OL program more efficient? -More effective? -More accessible?

96 Summative Evaluation Summative evaluations focus on the overall success of the OL experience (should it be continued?). A summative focus will look at whether or not objectives are met, the training is cost-effective, etc.

97 What Can OL Evaluation Measure? Categories of Evaluation Info (Woodley and Kirkwood, 1986): measures of activity, measures of efficiency, measures of outcomes, measures of program aims, measures of policy, and measures of organizations

98 Typical Evaluation Frameworks for OL Commonly used frameworks include: –CIPP Model –Objectives-oriented –Marshall & Shriver’s 5 levels –Kirkpatrick’s 4 levels Plus a 5th level –AEIOU –Consumer-oriented

99 CIPP Model Evaluation CIPP is a management-oriented model –C = context –I = input –P = process –P = product Examines the OL within its larger system/context

100 CIPP & OL: Context Context: Addresses the environment in which OL takes place. How does the real environment compare to the ideal? Uncovers systemic problems that may dampen OL success.

101 CIPP & OL: Input Input: Examines what resources are put into OL. Is the content right? Have we used the right combination of media? Uncovers instructional design issues.

102 CIPP & OL: Process Process: Examines how well the implementation works. Did the course run smoothly? Were there technology problems? Was the facilitation and participation as planned? Uncovers implementation issues.

103 CIPP & OL: Product Product: Addresses outcomes of the learning. Did the learners learn? How do you know? Does the online training have an effect on workflow or productivity? Uncovers systemic problems.

104 Objectives-Oriented Evaluation Examines OL training objectives as compared to training results Helps determine if objectives are being met Helps determine if objectives, as formally stated, are appropriate Objectives can be used as a comparative benchmark between online and other training methods

105 Evaluating Objectives & OL An objectives-oriented approach can examine two levels of objectives: –Instructional objectives for learners (did the learners learn?) –Systemic objectives for training (did the training solve the problem?)

106 Objectives & OL Requires: –A clear sense of what the objectives are (always a good idea anyway) –The ability to measure whether or not objectives are met Some objectives may be implicit and hard to state Some objectives are not easy to measure

107 Marshall & Shriver's 5 Levels of Evaluation Performance-based evaluation framework Each level examines a different area of performance Requires demonstration of learning

108 Marshall & Shriver's 5 Levels Level I: Self (instructor) Level II: Course Materials Level III: Course Curriculum Level IV: Course Modules Level V: Learning Transfer

109 Kirkpatrick’s 4 Levels A common training framework. Examines training on 4 levels. Not all 4 levels have to be included in a given evaluation.

110 The 4 Levels Reaction Learning Behavior Results

111 A 5th Level Return on Investment is a 5th level It is related to results, but is more clearly stated as a financial calculation How to calculate ROI is the big issue here

112 Is ROI the answer? Elise Olding of CLK Strategies suggests that we shift from looking at ROI to looking at time to competency. ROI may be easier to calculate since concrete dollars are involved, but time to competency may be more meaningful in terms of actual impact.

113 Example: Call Center Training Traditional call center training can take 3 months to complete Call center employees typically quit within one year When OL was implemented, the time to train (time to competency) was reduced Benchmarks for success: time per call; number of transfers

114 Example: Circuit City Circuit City provided online product/sales training What is more useful to know: –The overall ROI or break-even point? –How much employees liked the training? –How many employees completed the training? –That employees who completed 80% of the training saw an average increase of 10% in sales?

115 A 6th Level? Clark Aldrich (2002) adds a Level 6, which relates to the budget and stability of the e-learning team. –Just how respected and successful is the e-learning team? –Have they won approval from senior management for their initiatives? –Aldrich, C. (2002). Measuring success: In a post-Maslow/Kirkpatrick world, which metrics matter? Online Learning, 6(2), 30 & 32.

116 And Even a 7th Level? Clark Aldrich (2002) Level 7 asks whether the e-learning sponsor(s) or champion(s) are promoted in the organization. While both of these additional levels address the people involved in the e-learning initiative or plan, such recognition will likely hinge on the results of evaluation at the other five levels.

117 ROI Alternative: Cost/Benefit Analysis (CBA) ROI may be ill-advised since not all impacts hit the bottom line, and those that do take time. CBA shifts the attention to more long-term results and to quantifying impacts with numeric values, such as: –increased revenue streams, –increased employee retention, or –reduction in calls to a support center. Reddy, A. (2002, January). E-learning ROI calculations: Is a cost/benefit analysis a better approach? e-learning, 3(1), 30-32.

118 Cost/Benefit Analysis (CBA) CBA attends to both qualitative and quantitative measures: –job satisfaction ratings, –new uses of technology, –reduction in processing errors, –quicker reactions to customer requests, –reduction in customer call rerouting, –increased customer satisfaction, –enhanced employee perceptions of training, –global post-test availability. Reddy, A. (2002, January). E-learning ROI calculations: Is a cost/benefit analysis a better approach? e-learning, 3(1), 30-32.

119 Cost/Benefit Analysis (CBA) In effect, CBA asks how the sum of the benefits compares to the sum of the costs. Yet it often leads to or supports ROI and other more quantitatively oriented calculations. Reddy, A. (2002, January). E-learning ROI calculations: Is a cost/benefit analysis a better approach? e-learning, 3(1), 30-32.

120 Other ROI Alternatives Time to competency (need benchmarks) –online databases of frequently asked questions can help employees in call centers learn skills more quickly and without requiring temporary leaves from their position for such training Time to market –might be measured by how e-learning speeds up the training of sales and technical support personnel, thereby expediting the delivery of a software product to the market Raths, D. (2001, May). Measure of success. Online Learning, 5(5), 20-22, 24.

121 Still Other ROI Alternatives Return on Expectation 1. Asks employees a series of questions related to how training met expectations of their job performance. 2. When questioning is complete, they place a dollar figure on that. 3. Correlate or compare such reaction data with business results, or supplement Level 1 data to include more pertinent information about the applicability of learning to the employee's present job situation. –Raths, D. (2001, May). Measure of success. Online Learning, 5(5), 20-22, 24.

122 AEIOU Provides a framework for looking at different aspects of an online learning program Fortune & Keith, 1992; Sweeney, 1995; Sorensen, 1996

123 A = Accountability Did the training do what it set out to do? Data can be collected through –Administrative records –Counts of training programs (# of attendees, # of offerings) –Interviews or surveys of training staff

124 E = Effectiveness Is everyone satisfied? –Learners –Instructors –Managers Were the learning objectives met?

125 I = Impact Did the training make a difference? Like Kirkpatrick’s level 4 (Results)

126 O = Organizational Context Did the organization's structures and policies support or hinder the training? Does the training meet the organization's needs? OC evaluation can help identify mismatches between the training design and the organization Important when using third-party training or content

127 U = Unintended Consequences Unintended consequences are often overlooked in training evaluation May give you an opportunity to brag about something wonderful that happened Typically discovered via qualitative data (anecdotes, interviews, open-ended survey responses)

128 Consumer-Oriented Evaluation Uses a consumer point-of-view –Can be a part of vendor selection process –Can be a learner-satisfaction issue Relies on benchmarks for comparison of different products or different learning media

129 What About Evaluation Issues in Higher Education???

130 My Evaluation Plan…

131 What to Evaluate? 1. Student—attitudes, learning, jobs. 2. Instructor—popularity, survival. 3. Training—effectiveness, integration. 4. Task—relevance, interactivity, collaboration. 5. Tool—usable, learner-centered, friendly, supportive. 6. Course—interactivity, completion. 7. Program—growth, model(s), time to build. 8. University—cost-benefit, policies, vision.

132 1.Measures of Student Success (Focus groups, interviews, observations, surveys, exams, records) Positive Feedback, Recommendations Increased Comprehension, Achievement High Retention in Program Completion Rates or Course Attrition Jobs Obtained, Internships Enrollment Trends for Next Semester

133 1. Student Basic Quantitative Grades, Achievement Number of Posts, Participation Computer Log Activity—peak usage, messages/day, time on task or in system Attitude Surveys

134 1. Student High-End Success Message complexity, depth, interactivity, questioning Collaboration skills Problem finding/solving and critical thinking Challenging and debating others Case-based reasoning, critical thinking measures Portfolios, performances, PBL activities

135 Focus of Assessment? 1.Basic Knowledge, Concepts, Ideas 2.Higher-Order Thinking Skills, Problem Solving, Communication, Teamwork 3.Both of Above!!! 4.Other…

136 Assessments Possible Online Portfolios of Work Discussion/Forum Participation Online Mentoring Weekly Reflections Tasks Attempted or Completed, Usage, etc.

137 More Possible Assessments Quizzes and Tests Peer Feedback and Responsiveness Cases and Problems Group Work Web Resource Explorations & Evaluations

138 Increasing Cheating Online ($7-$30/page, http://www.syllabus.com/ January, 2002, Phillip Long, Plagiarism: IT-Enabled Tools for Deceit?) http://www.academictermpapers.com/ http://www.termpapers-on-file.com/ http://www.nocheaters.com/ http://www.cheathouse.com/uk/index.html http://www.realpapers.com/ http://www.pinkmonkey.com/ (“you’ll never buy Cliffnotes again”)

139

140

141

142 Reducing Cheating Online Ask yourself, why are they cheating? Do they value the assignment? Are tasks relevant and challenging? What happens to the task after it is submitted—reused, woven in, posted? Due at end of term? Real audience? Look at pedagogy before calling the plagiarism police!

143 Reducing Cheating Online Proctored exams Vary items in exam Make course too hard to cheat Try Plagiarism.com ($300) Use mastery learning for some tasks Random selection of items from item pool Use test passwords, rely on IP# screening Assign collaborative tasks

144 Reducing Cheating Online ($7-$30/page, http://www.syllabus.com/ January, 2002, Phillip Long, Plagiarism: IT-Enabled Tools for Deceit?) http://www.plagiarism.org/ (resource) http://www.turnitin.com/ (software, $100, free 30 day demo/trial) http://www.canexus.com/ (software; essay verification engine, $19.95) http://www.plagiserve.com/ (free database of 70,000 student term papers & cliff notes) http://www.academicintegrity.org/ (assoc.) http://sja.ucdavis.edu/avoid.htm (guide)

145

146 Turnitin Testimonials "Many of my students believe that if they do not submit their essays, I will not discover their plagiarism. I will often type a paragraph or two of their work in myself if I suspect plagiarism. Every time, there was a "hit." Many students were successful plagiarists in high school. A service like this is needed to teach them that such practices are no longer acceptable and certainly not ethical!”

147 Part III: Applying Kirkpatrick’s 4 Levels to Online Learning Evaluation & Evaluation Design

148 Why Use the 4 Levels? They are familiar and understood Highly referenced in the training literature Can be used with 2 delivery media for comparative results

149 Conducting 4-Level Evaluation You need not use every level –Choose the level that is most appropriate to your need and budget Higher levels will be more costly and difficult to evaluate Higher levels will yield more useful information

150 Kirkpatrick Level 1: Reaction Typically involves "smile sheets" or end-of-training evaluation forms. Easy to collect, but not always very useful. Reaction-level data on online courses have been found to correlate with the ability to apply learning to the job. The survey ideally should be Web-based, keeping the medium the same as the course.

151 Kirkpatrick Level I: Reaction Types of questions: –Enjoyable? –Easy to use? –How was the instructor? –How was the technology? –Was it fast or slow enough?

152 Kirkpatrick Level 2: Learning Typically involves testing learners immediately following the training Not difficult to do, but online testing has its own challenges –Did the learner take the test on his/her own?

153 Kirkpatrick Level 2: Learning Higher-order thinking skills (problem solving, analysis, synthesis) Basic skills (articulate ideas in writing) Company perspectives and values (teamwork, commitment to quality, etc.) Personal development

154 Kirkpatrick Level 2: Learning Might include: –Essay tests. –Problem solving exercises. –Interviews. –Written or verbal tests to assess cognitive skills. Shepard, C. (1999b, July). Evaluating online learning. TACTIX from Fastrak Consulting. Retrieved February 10, 2002, from: http://fastrak-consulting.co.uk/tactix/Features/evaluate/eval01.htm.

155 Kirkpatrick Level 3: Behavior More difficult to evaluate than Levels 1 & 2 Looks at whether learners can apply what they learned (does the training change their behavior?) Requires post-training follow-up to determine Less common than levels 1 & 2 in practice

156 Kirkpatrick Level 3: Behavior Might include: –Direct observation by supervisors or coaches (Wisher, Curnow, & Drenth, 2001). –Questionnaires completed by peers, supervisors, and subordinates related to work performance. –On the job behaviors, automatically logged performances, or self-report data. Shepard, C. (1999b, July). Evaluating online learning. TACTIX from Fastrak Consulting. Retrieved February 10, 2002, from: http://fastrak-consulting.co.uk/tactix/Features/evaluate/eval01.htm.

157 Kirkpatrick Level 4: Results Often compared to return on investment (ROI) In e-learning, it is believed that the increased cost of course development ultimately is offset by the lesser cost of training implementation A new way of training may require a new way of measuring impact

158 Kirkpatrick Level 4: Results Might Include: –Labor savings (e.g., reduced duplication of effort or faster access to needed information). –Production increases (faster turnover of inventory, forms processed, accounts opened, etc.). –Direct cost savings (e.g., reduced cost per project, lowered overhead costs, reduction of bad debts, etc.). –Quality improvements (e.g., fewer accidents, fewer defects, etc.). Horton, W. (2001). Evaluating e-learning. Alexandria, VA: American Society for Training & Development.

159 Kirkpatrick + Evaluation Design Kirkpatrick’s 4 Levels may be achieved via various evaluation designs Different designs help answer different questions

160 Pre/Post Control Groups One group receives OL training and one does not As variation try 3 groups –No training (control) –Traditional training –OL training Recommended because it may help neutralize contextual factors Relies on random assignment as much as possible
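As a rough illustration of how pre/post data from the three groups in this design might be compared, here is a minimal Python sketch. The group names and the paired pre/post score lists are hypothetical, and a real evaluation would add significance testing on top of the simple mean gains shown here.

```python
# Minimal sketch: comparing mean pre/post gains across a control group,
# a traditionally trained group, and an online-learning (OL) group.
# Group names and input format are illustrative assumptions.

from statistics import mean

def mean_gain(pre_scores, post_scores):
    """Average post-minus-pre gain for one group."""
    return mean(post - pre for pre, post in zip(pre_scores, post_scores))

def compare_groups(groups):
    """groups: dict mapping group name -> (pre_scores, post_scores)."""
    return {name: mean_gain(pre, post) for name, (pre, post) in groups.items()}

# Usage (hypothetical scores):
# gains = compare_groups({
#     "control":     ([55, 60, 58], [56, 61, 59]),
#     "traditional": ([54, 59, 57], [70, 72, 69]),
#     "online":      ([56, 61, 60], [74, 73, 71]),
# })
# print(gains)
```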

161 Multiple Baselines Can be used for a program that is rolling out Each group serves as a control group for the previous group Look for improvement in subsequent groups Eliminates need for tight control of control group

162 Time Series Looks at benchmarks before and after training Practical and cost-effective Not considered as rigorous as other designs because it doesn’t control for contextual factors

163 Single Group Pre/Post Easy and inexpensive Criticized for lack of rigor (absence of control) Needs to be pushed into Kirkpatrick levels 3 and 4 to see if there has been impact

164 Case Study A rigorous design in academic practice, but often after-the-fact in corporate settings Useful when no preliminary or baseline data have been collected

165 Part IV: ROI and Online Learning

166 The Importance of ROI OL requires a great amount of $$ and other resources up front It gives the promise of financial rewards later on ROI is of great interest because of the investment and the wait period before the return

167 Calculating ROI Look at: –Hard cost savings –Hard revenue impact –Soft competitive benefits –Soft benefits to individuals See: Calculating the Return on Your eLearning Investment (2000) by Docent, Inc.

168 Possible ROI Objectives Better Efficiencies Greater Profitability Increased Sales Fewer Injuries on the Job Less Time off Work Faster Time to Competency

169 Hard Cost Savings Travel Facilities Printed material costs (printing, distribution, storage) Reduction of costs of business through increased efficiency Instructor fees (sometimes)

170 Hard Revenue Impact Consider –Opportunity cost of improperly or untrained personnel –Shorter time to productivity through shorter training times with OL –Increased time on job (no travel time) –Ease of delivering same training to partners and customers (for fee?)

171 Soft Competitive Benefits Just-in-time capabilities Consistency in delivery Certification of knowledge transfer Ability to track users and gather data easily Increase morale from simultaneous roll-out at different sites

172 Individual Values Less wasted time Support available as needed Motivation from being treated as an individual

173 Talking about ROI As a percentage –ROI = [(Payback - Investment) / Investment] * 100 As a ratio –ROI = Return / Investment As time to break even –Break-even time = (Investment / Return) * Time Period
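The three expressions above translate directly into code. A minimal Python sketch follows; payback, investment, return, and time period are inputs you would supply from your own program data, and the usage numbers are hypothetical.

```python
# Minimal sketch of the three ROI expressions from the slide.

def roi_percent(payback, investment):
    """ROI as a percentage: [(Payback - Investment) / Investment] * 100."""
    return (payback - investment) / investment * 100

def roi_ratio(total_return, investment):
    """ROI as a ratio: Return / Investment."""
    return total_return / investment

def break_even_time(investment, total_return, time_period):
    """Time to break even: (Investment / Return) * Time Period."""
    return investment / total_return * time_period

# Usage (hypothetical): a $100,000 investment returning $150,000 over 12 months.
# print(roi_percent(150_000, 100_000))          # 50.0 (%)
# print(roi_ratio(150_000, 100_000))            # 1.5
# print(break_even_time(100_000, 150_000, 12))  # 8.0 (months)
```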

174 What is ROI Good For? Prioritizing Investment Ensuring Adequate Financial Support for OL Project Comparing Vendors

175 The Changing Face of ROI “Return-on-investment isn’t what it used to be … The R is no longer the famous bottom line and the I is more likely a subscription fee than a one-time payment” (Cross, 2001)

176 More Calculations Total Admin Costs of Former Program - Total Admin Costs of OL Program = Projected Net Savings Total Cost of Training / # of Students = Cost Per Student (CPS) (Total Benefits * 100) / Total Program Cost = ROI%
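These three calculations can likewise be sketched as simple functions; all inputs are values you would supply from your own cost data, so treat this as a minimal sketch rather than a complete costing model.

```python
# Minimal sketch of the slide's three calculations.

def projected_net_savings(former_admin_costs, ol_admin_costs):
    """Total admin costs of former program minus those of the OL program."""
    return former_admin_costs - ol_admin_costs

def cost_per_student(total_training_cost, num_students):
    """CPS: total cost of training divided by number of students."""
    return total_training_cost / num_students

def roi_percent_from_benefits(total_benefits, total_program_cost):
    """ROI%: (Total Benefits * 100) / Total Program Cost."""
    return total_benefits * 100 / total_program_cost
```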

177 At the End of the Day... Are all training results quantifiable? NO! Putting a price tag on some costs and benefits can be very difficult NO! Some data may not have much meaning at face value –What if more courses are offered and annual student training hours drop simultaneously? Is this bad?

178 Part V: Collecting Evaluation Data & Online Evaluation Tools

179 Collecting Evaluation Data Learner Reaction Learner Achievement Learner Job Performance Manager Reaction Productivity Benchmarks

180 Forms of Evaluation Interviews Focus Groups Self-Analysis Supervisor Ratings Surveys and Questionnaires ROI Document Analysis Data Mining (Changes in pre and post- training; e.g., sales, productivity)

181 How to Collect Data? Direct Observation in Work Setting –By supervisor, co-workers, subordinates, clients Collect Data By Surveys, Interviews, Focus Groups –Supervisors, Co-workers, Subordinates, Clients Self-Report by learners or teams

182 Learner Data Online surveys are the most effective way to collect online learner reactions Learner performance data can be collected via online tests –Pre and post-tests can be used to measure learning gains Learner post-course performance data can be used for Level 3 evaluation –May look at on-the-job performance –May require data collection from managers

183 Example: Naval Phys. Training Follow-Up Evaluation A naval training unit uses an online survey/database system to track performance of recently trained physiologists Learners self-report performance Managers report on learner performance Unit heads report on overall productivity

184 Learning System Data Many statistics are available, but which are useful? –Number of course accesses –Log-in times/days –Time spent accessing course components –Frequency of access for particular components –Quizzes completed and quiz scores –Learner contributions to discussion (if applicable)
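To show how a few of these statistics could be pulled from raw access records, here is a minimal Python sketch. The record format (learner, component, minutes spent) is an assumption for illustration; real learning systems each export their own log schema.

```python
# Minimal sketch: summarizing hypothetical LMS access records.
# Each record is (learner_id, component, minutes_spent); this format is an
# illustrative assumption, not any particular LMS's export schema.

from collections import Counter, defaultdict

def summarize(records):
    """Return (number of accesses per learner, total minutes per component)."""
    accesses_per_learner = Counter(learner for learner, _, _ in records)
    minutes_per_component = defaultdict(float)
    for _, component, minutes in records:
        minutes_per_component[component] += minutes
    return accesses_per_learner, dict(minutes_per_component)

# Usage (hypothetical records):
# records = [("alice", "quiz1", 12.5), ("alice", "forum", 30.0), ("bob", "quiz1", 9.0)]
# print(summarize(records))
```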

185 Learner System Data IF learners are being evaluated based on number and length of accesses, it is only fair that they be told Much time can be wasted analyzing statistics that don’t tell much about the actual impact of the training Bottom line: Easy data to collect, but not always useful for evaluation purposes –Still useful for management purposes

186 Benchmark Data Companies need to develop benchmarks for measuring performance improvement Managers typically know the job areas that need performance improvement Both pre-training and post-training data need to be collected and compared Must also look for other contextual factors

187 Online Testing Tools (see: http://www.indiana.edu/~best/)

188

189

190 Test Selection Criteria (Hezel, 1999) Easy to Configure Items and Test Handle Symbols Scheduling of Feedback (immediate?) Provides Clear Input of Dates for Exam Easy to Pick Items for Randomizing Randomize Answers Within a Question Weighting of Answer Options

191 More Test Selection Criteria Recording of Multiple Submissions Timed Tests Comprehensive Statistics Summarize in Portfolio and/or Gradebook Confirmation of Test Submission

192 More Test Selection Criteria (Perry & Colon, 2001) Supports multiple item types—multiple choice, true-false, essay, keyword Can easily modify or delete items Incorporate graphic or audio elements? Control over number of times students can submit an activity or test Provides feedback for each response

193 More Test Selection Criteria (Perry & Colon, 2001) Flexible scoring—score first, last, or average submission Flexible reporting—by individual or by item and cross tabulations Outputs data for further analysis Provides item analysis statistics (e.g., Test Item Frequency Distributions)

194 Computer Log Data Chen, G. D., Liu, C. C., Liu, B. J. (2000). Discovering decision knowledge from Web log portfolio for managing classroom processes by applying decision tree and data cube technology. Journal of Educational Computing Research, 23(3), 305-332. Determine student behavior patterns –student posting opinions, –asking questions, –replying to opinions, –posting articles, etc. Web logs can also help instructors make informed pedagogical decisions. For instance, does a particular teaching strategy or task improve student interaction?
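The cited study applies decision trees to Web-log features. A minimal sketch of that general idea follows; the feature names, the label categories, and the use of scikit-learn are illustrative assumptions, not details from the study.

```python
# Minimal sketch: classifying student behavior patterns from Web-log features
# with a decision tree, in the spirit of Chen, Liu, & Liu (2000).
# Feature names, labels, and the choice of scikit-learn are assumptions.

from sklearn.tree import DecisionTreeClassifier, export_text

def fit_behavior_tree(X, y, feature_names):
    """X: rows of per-student log features (e.g., posts, questions, replies);
    y: behavior labels assigned by the evaluator (e.g., 'active', 'at_risk')."""
    tree = DecisionTreeClassifier(max_depth=3, random_state=0)
    tree.fit(X, y)
    # Print the learned rules so an instructor can inspect them.
    print(export_text(tree, feature_names=feature_names))
    return tree
```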

195 Computer Log Data Chen, G. D., Liu, C. C., Liu, B. J. (2000). Discovering decision knowledge from Web log portfolio for managing classroom processes by applying decision tree and data cube technology. Journal of Educational Computing Research, 23(3), 305-332. In a corporate training situation, computer log data can correlate online course completions with: –actual job performance improvements such as fewer violations of safety regulations, reduced product defects, increased sales, and timely call responses.

196 Email and Chat Chats and email messages might provide data about the effectiveness of the training event.

197 Online Survey Tools for Assessment

198 Sample Survey Tools Zoomerang (http://www.zoomerang.com) IOTA Solutions (http://www.iotasolutions.com) QuestionMark (http://www.questionmark.com/home.html) SurveyShare (http://SurveyShare.com; from Courseshare.com) Survey Solutions from Perseus (http://www.perseusdevelopment.com/fromsurv.htm) Infopoll (http://www.infopoll.com)

199

200

201 Survey Tool Features Maintain email lists and email invitations Conduct polls Adaptive branching and cross tabulations Modifiable templates Maintain library of past surveys Publish reports Technical support, chat advice Different types of accounts—hosted, corporate, professional, etc.

202

203

204

205

206

207 Web-Based Survey Advantages Faster collection of data Standardized collection format Computer graphics may reduce fatigue Computer controlled branching and skip sections Easy to answer by clicking Wider distribution of respondents

208 Web-Based Survey Problems: Why Lower Response Rates? Low response rate Lack of time Unclear instructions Too lengthy Too many steps Can’t find URL Perceived as aggressive

209 Web-Based Survey Solutions: Some Tips… Send second request Make URL link prominent Offer incentives near top of request Shorten survey, make attractive, easy to read Credible sponsorship—e.g., university Disclose purpose, use, and privacy E-mail cover letters Prenotify of intent to survey

210 Tips on Authentication Check e-mail access against list Use password access Provide keycode, PIN, or ID # (Futuristic Other: Palm Print, fingerprint, voice recognition, iris scanning, facial scanning, handwriting recognition, picture ID)
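A minimal sketch of the first tips (checking a respondent's e-mail against the invitation list and verifying the keycode issued with it) is shown below; the invitations dictionary is a hypothetical stand-in for a real respondent database, not part of any survey tool's API.

```python
# Minimal sketch: authenticate a survey respondent by checking the e-mail
# against the invitation list and verifying the issued keycode.
# The invitations dict is a hypothetical placeholder data structure.

def is_authorized(email, keycode, invitations):
    """invitations: dict mapping invited e-mail -> issued keycode (or PIN/ID)."""
    return invitations.get(email.lower()) == keycode

# Usage (hypothetical):
# invitations = {"learner@example.com": "K-4821"}
# print(is_authorized("Learner@example.com", "K-4821", invitations))  # True
```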

211 Some Final Advice…

212 As venture capital dries up and state funding is cut, evaluation and accountability take center stage in e-learning decision-making and discussion.

213 Questions? Comments? Concerns?

