Online Course Design: Evaluation and Revision

Presentation transcript:

1 Online Course Design: Evaluation and Revision
Jennifer Freeman

2 Session Goals
- Understand the difference between assessment and evaluation
- Define formative, summative, and confirmative evaluation and understand the importance of each
- Explore theories and methods of evaluation
- Create a course evaluation and revision plan

3 Evaluation vs. Assessment
- Evaluation: measuring the quality and effectiveness of learning materials and activities
- Assessment: measuring students' learning and achievement of goals and objectives

4 What Do We Evaluate?
- Objectives and alignment: are we teaching and assessing what we said we would? Are the instructional materials and testing instruments aligned with the objectives? Is each objective represented?
- Quality of instructional materials, resources, and strategies: everything from no typos to factually correct content to aesthetically pleasing materials. Is external content reliable and of good quality? Did that group project work out the way we thought it would?
- Instructional support: are students getting the support they need (library, help desk, tutorials)?
- Tools and technology: how easy were they for students to use? Did the chat tool work? Was the Flash exercise too complicated?
- Communication: was the level of interaction and communication appropriate?
- Testing instruments: how effective were they?
See the evaluation chart in Morrison, Ross and Kemp (2004).

5 Formative Evaluation of Instructional Materials
Why?
- Uncover problems early on; fix broken stuff: find all of the typos, errors, broken links, and gaps in content
- Discover potential usability/accessibility issues: test for problems that may arise for students with disabilities
- Examine effectiveness and improve functionality: what looked good on the storyboard may not work well in practice
- The dynamic nature of online learning: new technologies and teaching theories surface every day in a field as young as online learning

6 Formative Evaluation of Instructional Materials
What? When?
- An ongoing process, usually done both during development and while the course is being taught
- Asks the question, "How are we doing?"

7 Formative Evaluation: Who? How? What?
Who will use this evaluation information?
- Course development team
- Instructor
What should be evaluated?
- Instructional materials
- Instructional strategies
- Use of tools and technology
Notes: keep the audience in mind when designing your evaluation plan and writing the questions you will ask: who will be interested in the evaluation feedback at this stage? Student feedback on instructional support will probably be limited at this point.

8 Formative Evaluation: Questions to Ask
- Do learning activities and assessments align with the learning objectives? Can you draw a clear correlation to a learning goal for each course activity? Is each learning objective represented by content, activities, and assessment?
- Do learning materials meet quality standards? Are they error-free, accessible, and usable?
- Are the technology tools appropriate and working properly? Test them: will students be able to use them properly? Are further instructions or documentation needed for any tool being used (LMS messaging, chat, discussion areas, assignment tool, tests, etc.)?

9 Formative Evaluation: Gathering Data
- Course development rubrics and checklists, for consistency and repeatability: develop a testing process, protocol, list of questions, and evaluation forms, and set a standardized time for testing so it occurs regularly as a normal part of the process. To reduce bias, evaluation should not be done by members of the course development team; you need a "fresh pair of eyes," such as an experienced instructional designer or developer who did not work on the course.
- Focus group feedback: a focus group of potential users (or freelance reviewers) should test every part of the course.
- Help desk error logs: during the first semester a course is taught, keep a close watch on the error logs from the help desk; track problem areas and note improvements to be made.
- Students: FAQ discussion threads, or an "extra credit for errors found" incentive. Let your students help!
- Faculty notes jotted down during the semester.


13 Sample Formative Evaluation and Revision Plan
During development:
- Checkpoint #1: syllabus, outline, and first lesson
- Checkpoint #2: half of the course, viewed on multiple platforms
- Checkpoint #3: entire course proofread/edited
- Checkpoint #4: entire course tech reviewed
- Checkpoint #5: final check (previously found errors)
While the course is live:
- Student survey after the first three lessons
- Instructor survey after the first three lessons
- Examination of help desk error logs
Notes: show sample checkpoint rubrics, checklists, and surveys (samples still to be identified).

14 Sample Formative Evaluation and Revision Plan
- Analyze the problems found: how urgent is each issue? How long will it take to fix?
- Assign each issue a priority score:
  URGENT – will immediately affect a student's grade (e.g., a broken test that must be taken by midnight)
  MAJOR – important content is unavailable (e.g., a JavaScript mouseover function isn't working correctly)
  MINOR – cosmetic (e.g., a typo in the instructor's bio)
- Establish a threshold below which a course launch will be postponed
- Build a prioritized list of change requests: when is the best time to revise?
- Assign corrections and establish a deadline for each
- Make note of unaddressed issues
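The prioritization step above can be sketched as a small script. This is a hypothetical illustration only: the numeric severity scores, the postponement threshold value, and the sample issue descriptions are assumptions, not part of the original plan.

```python
# Hypothetical triage of formative-evaluation issues, following the
# URGENT / MAJOR / MINOR levels described above. The numeric scores
# and the threshold are illustrative assumptions.

SEVERITY = {"urgent": 3, "major": 2, "minor": 1}

def triage(issues, postpone_threshold=3):
    """Rank issues by priority and decide whether launch should be postponed.

    issues: list of (description, severity) tuples, severity a key of SEVERITY.
    Returns (ranked_issues, postpone_flag); the course is postponed when the
    total severity score of open issues meets or exceeds the threshold.
    """
    ranked = sorted(issues, key=lambda i: SEVERITY[i[1]], reverse=True)
    total = sum(SEVERITY[sev] for _, sev in issues)
    return ranked, total >= postpone_threshold

issues = [
    ("Typo in the instructor's bio", "minor"),
    ("Broken test due by midnight", "urgent"),
    ("Mouseover content not displaying", "major"),
]
ranked, postpone = triage(issues)
# ranked now lists the urgent issue first; assign corrections and a
# deadline for each, and record any unaddressed minor issues.
```

In practice a course team would fold the "how long will it take to fix?" estimate into the score as well; the single-number score here keeps the sketch minimal.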

15 Summative Evaluation: Why? What? When?
Why?
- Examine effectiveness and improve functionality
- Discover causes for failures and fix existing problems: what works in theory doesn't always work in practice
- Constant maintenance and improvements to content and strategies: the dynamic nature of online learning
What? When?
- Usually done after the completion of each semester
- Asks the question, "How did we do?"
Notes: once the first semester is complete, student data and feedback are available to identify problems not discovered earlier. "The first semester is like the first pancake." Given the dynamic nature of teaching with technology: what improvements have been made that we can take advantage of? What new research is available?

16 Summative Evaluation: Who? How? What?
Who will use this evaluation information?
- Instructor
- Course development team
- Administration
What should be evaluated?
- Effectiveness of instructional materials and strategies
- The learning environment
- The instructor's teaching skills
- Availability and ease of use of tools and technology
- Instructor satisfaction with the online teaching experience
- Student satisfaction with the online learning experience
Notes: keep the audience in mind: what questions are they interested in having answered? In addition to the areas evaluated during the formative stage, we can now look at areas measured by student success and feedback: the "feel" of the learning environment, the instructor's skill in teaching online, the usability of the content and teaching tools, and the instructor's opinions on the experience.

17 Summative Evaluation: Questions to Ask
Student success grades learning activities , assignments and assessments time Instructor and student satisfaction learning materials motivation tools Are program/department needs being met accreditations, prerequisites for other courses, competencies Is the course scalable? Did the students succeed? (grades) Did the learning activities and assessments align with the learning objectives? Were assignments and assessment appropriate to the content? Was time adequate to convey material and complete tasks? Level of instructor and student satisfaction (participation and opinion) Were learning materials easy to use and accessible? What content did students frequently have problems with? What areas of the course are error-prone? Were there any concerns about motivation? What tools did the instructor or students frequently have problems with? Should we continue to use chosen tools? Are program/department needs being met accreditations, prerequisites for other courses, competencies Is the course scaleable? How were students grades? How was the level of participation, according to student opinion? According to instructor opinion? Compared with other courses? How did students feel about the assignments and assessments? How did the instructor feel? What was the rate of success/completion of the assignments and assessments compared to other courses? What were the instructor’s and students’ opinions about time requirements? (This is a big one. Almost every online course attempts to do too much the first time around.) Other student opinions: content? Environment? Level of communication? Tools/technology? Re-visit the help desk logs…where were the problems?

18 Summative Evaluation: Gathering Data
- Student grades
- Student surveys
- Instructor satisfaction surveys
- Learner self-assessments
- Pretest/posttest comparisons
- Assessment item analysis
- Focus group feedback
- Help desk error logs
- Discussion forum and chat archives
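Assessment item analysis, listed above as a data source, can be sketched as a short script. This is a hypothetical illustration: the response matrix is invented, and the difficulty measure used (proportion of students answering each item correctly) is one common convention, not something prescribed by the presentation.

```python
# Hypothetical item analysis for an online quiz. An item's "difficulty"
# here is the proportion of students who answered it correctly; items
# with very low values are candidates for revision during the summative
# phase. The sample data below is invented for illustration.

def item_difficulty(responses):
    """responses: list of per-student lists of 0/1 item scores.
    Returns the proportion of students answering each item correctly."""
    n_students = len(responses)
    n_items = len(responses[0])
    return [sum(student[i] for student in responses) / n_students
            for i in range(n_items)]

# Four students, three items (1 = correct, 0 = incorrect)
responses = [
    [1, 1, 0],
    [1, 0, 0],
    [1, 1, 1],
    [1, 0, 0],
]
difficulty = item_difficulty(responses)
# The third item was answered correctly by only one student; that may
# signal misaligned content or a flawed question worth reviewing.
```

A fuller analysis would also look at item discrimination (how well each item separates high- and low-scoring students), but difficulty alone is enough to flag the problem areas the slide refers to.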


21 Sample Summative Evaluation and Revision Plan
- Analyze student and faculty surveys; identify themes or trends
- Analyze assessment results
- Analyze help desk logs
- Examine course archives
- Compile a list of issues (including issues noted during the formative phase that have yet to be addressed)
- Research solutions and determine the time needed to fix each issue
- Assign priority ratings
- Assign tasks and establish deadlines
Notes: the schedule will vary depending on how soon the course is to be offered again and on the resources available (release time or extra staff are less likely to be forthcoming than during initial development). You may have to limit revisions to urgent issues until time and resources are available.

22 Confirmative Evaluation
Why?
- Discover the long-term effectiveness of the course
- Address large-scale changes necessary to the curriculum: is the course still appropriate within the department (prerequisites for other courses, fit, etc.)?
- Constant maintenance and improvements to technology, content, and strategies: the dynamic nature of online learning (should we continue using the LMS? are there budgetary concerns?)
What? When?
- Usually done some time after the completion of each semester
- Asks the question, "How are we doing now?"

23 Confirmative Evaluation
Who will use the evaluation information? How? What is being evaluated? Who will use the evaluation information? Instructor Administration How? What is being evaluated? Students’ long-term retention of learning, usefulness to their long-term goals Long term effectiveness of the course within the program LMS and other technology/tools

24 Confirmative Evaluation: Questions to Ask
- Are program/department needs being met (accreditations, prerequisites for other courses, competencies)?
- What are the trends in student satisfaction? Is the course valuable and meaningful to students' long-term program and career goals?
- Is the course scalable? Is it sustainable?
- Are the learning environment, technology, and tools still meeting our needs? (LMS evaluation; is a major course redesign warranted?)

25 Confirmative Evaluation: Gathering Data
- Program student surveys
- Departmental administrative opinions
- Faculty peer review of learning materials
- Employer surveys
- Retention data
- Help desk logs
- LMS effectiveness study/survey

26 Common LMS Evaluation Criteria
- Cost: are costs rising at a reasonable rate?
- Server space and maintenance: are server space and maintenance needs being met?
- Vendor/software standards: are the vendor and software in compliance with required standards?
- Reliability: how reliable has the system been?
- Security: have there been any security concerns?
- Customization: what level of customization is possible within the system?
- Course structure and presentation: are we satisfied with the structure and presentation of courses? With the authoring tools provided? With the tracking capabilities of the system? With the testing engine and/or assessment tools available in the system?

27 Common LMS Evaluation Criteria (continued)
- Collaboration tools: are faculty, students, and staff satisfied with the collaboration tools (discussion areas, journaling, help desk, whiteboard) provided through the system?
- Productivity tools: are faculty, students, and staff satisfied with the productivity tools (calendar, help files, search engine) provided through the system?
- Documentation and training: is student/faculty/staff documentation or training sufficient? How usable do students, faculty, and staff find the tools?
- Vendor reputation and position: what is the vendor's reputation in the industry? What is the vendor's position in the industry?


30 Evaluation Activity
For each learning object/activity idea on the flip chart, the group will ask and discuss:
- Is it accessible?
- Is it scalable?
- Is it sustainable?
If the answer to any question is no, ideas for improvement will be discussed.

31 Jennifer Freeman, jenni.z.freeman@gmail.com

