Presentation on theme: "Online Course Design: Evaluation and Revision" — Presentation transcript:
1 Online Course Design: Evaluation and Revision. Jennifer Freeman
2 Session Goals
- Understand the difference between assessment and evaluation
- Define formative, summative, and confirmative evaluation and understand the importance of each
- Explore theories and methods of evaluation
- Create a course evaluation and revision plan
3 Evaluation vs. Assessment
- Evaluation: measuring the quality and effectiveness of learning materials and activities
- Assessment: measuring students' learning and achievement of goals and objectives
4 What Do We Evaluate? (Morrison, Ross and Kemp, 2004)
- Objectives and alignment: are we teaching and assessing what we said we would? Do instructional materials align with objectives, and are testing instruments effective?
- Quality of instructional materials, resources, and strategies: everything from typo-free, factually correct content to aesthetically pleasing materials. Is external content reliable and of good quality?
- Instructional support: are students getting the support they need (library, help desk, tutorials)?
- Teaching strategies and skills: did that group project work out the way we thought it would? Was the level of interaction and communication appropriate?
- Tools and technology: how easy were they for students to use? Did the chat tool work? Was the Flash exercise too complicated?
- Evaluation chart
5 Formative Evaluation of Instructional Materials: Why?
- Uncover problems early on and fix broken content: find all of the typos, errors, broken links, and gaps in content
- Discover potential usability and accessibility issues: test for problems that may arise for students with disabilities
- Examine effectiveness and improve functionality: what looked good on the storyboard may not work well in practice
- Respond to the dynamic nature of online learning: new technologies and teaching theories surface every day in a field as young as online learning
6 Formative Evaluation of Instructional Materials: What? When?
- An ongoing process, usually done both during development and while the course is being taught
- Asks the question, "How are we doing?"
7 Formative Evaluation: Who? How? What?
- Who will use this evaluation information? The course development team and the instructor. Keep the audience in mind when designing your evaluation plan and writing the questions you will ask: who will be interested in the evaluation feedback at this stage?
- What should be evaluated? Instructional materials, instructional strategies, and use of tools and technology. You will probably have limited student feedback on instructional support at this point.
8 Formative Evaluation: Questions to Ask
- Do learning activities and assessments align with the learning objectives? Can you draw a clear correlation to a learning goal for each course activity? Is each learning objective represented by content, activities, and assessment?
- Do learning materials meet quality standards? Are they error-free, accessible, and usable?
- Are the technology tools appropriate and working properly? Test them: will students be able to use them properly? Are further instructions or documentation needed for any tool being used (LMS messaging, chat, discussion areas, assignment tool, tests, etc.)?
9 Formative Evaluation: Gathering data
- Course development rubrics and checklists: for consistency and repeatability, develop a testing process, protocol, list of questions, and evaluation forms, and set a standardized time for testing so it occurs regularly as a normal part of the process. To reduce bias, evaluation should not be done by members of the course development team; you need a "fresh pair of eyes". Assign rubrics and checkpoints to an experienced instructional designer/developer who didn't work on the course.
- Focus group feedback and Q&A reviews: a focus group of potential users (e.g., freelance reviewers) should test every part of the course.
- Help desk error logs: during the first semester a course is being taught, keep a close watch on the error logs from the help desk. Keep track of problem areas and note improvements to be made.
- Students: let your students help! Student FAQ discussion threads, or an "extra credit for errors found" idea.
- Faculty notes jotted down during the semester
13 Sample Formative Evaluation and Revision Plan
During development:
- Checkpoint #1: syllabus, outline, and first lesson
- Checkpoint #2: half of the course, viewed on multiple platforms
- Checkpoint #3: entire course proofread/edited
- Checkpoint #4: entire course tech reviewed
- Checkpoint #5: final check (previous errors)
Course is live:
- Student survey after first three lessons
- Instructor survey after first three lessons
- Examination of help desk error logs
(Facilitator: show sample checkpoint rubrics, checklists, and surveys.)
15 Summative Evaluation: Why? What? When?
Why?
- Examine effectiveness
- Improve functionality: what works in theory doesn't always work in practice
- Discover causes for failures and fix existing problems
- Constant maintenance and improvements to content and strategies: the dynamic nature of online learning. What improvements in teaching with technology can we take advantage of? What new research is available?
What? When?
- Usually done after the completion of each semester
- Asks the question, "How did we do?"
- Once the first semester is complete, student data and feedback are available to identify problems not discovered earlier. "The first semester is like the first pancake."
16 Summative Evaluation: Who? How? What?
- Who will use this evaluation information? The instructor, the course development team, and administration. Keep the audience in mind: what questions are they interested in having answered?
- What should be evaluated?
  - Effectiveness of instructional materials and strategies
  - The learning environment
  - The instructor's teaching skills
  - Availability and ease of use of tools and technology
  - Instructor satisfaction with the online teaching experience
  - Student satisfaction with the online learning experience
- In addition to the areas evaluated during the formative stage, we can now look at areas measured by student success and feedback: the "feel" of the learning environment, the instructor's skill in teaching online, the usability of the content and teaching tools, and the instructor's opinions on the experience.
17 Summative Evaluation: Questions to Ask
- Student success: Did the students succeed (grades)? How was the level of participation, according to student and instructor opinion, and compared with other courses? Did the learning activities and assessments align with the learning objectives? Were assignments and assessments appropriate to the content? What was the rate of success/completion of the assignments and assessments compared to other courses? Was time adequate to convey material and complete tasks? (This is a big one: almost every online course attempts to do too much the first time around.)
- Instructor and student satisfaction: Were learning materials easy to use and accessible? What content did students frequently have problems with? What areas of the course are error-prone? Were there any concerns about motivation? What tools did the instructor or students frequently have problems with? Should we continue to use the chosen tools? Other student opinions: content? Environment? Level of communication?
- Are program/department needs being met (accreditations, prerequisites for other courses, competencies)?
- Is the course scalable?
- Re-visit the help desk logs: where were the problems?
18 Summative Evaluation: Gathering data
- Student grades
- Student surveys
- Instructor satisfaction surveys
- Learner self-assessments
- Pretest/posttest comparisons
- Assessment item analysis
- Focus group feedback
- Help desk error logs
- Discussion forum and chat archives
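Assessment item analysis from the list above can be done with simple statistics. The sketch below computes two classic measures for one multiple-choice item: difficulty (proportion correct) and an upper/lower-group discrimination index. All student data, function names, and the 27% grouping fraction are illustrative assumptions, not part of the presentation.

```python
# Hypothetical item-analysis sketch: item difficulty and a simple
# upper/lower-group discrimination index. All data below is invented.

def item_difficulty(responses):
    """Proportion of students who answered the item correctly (0.0-1.0)."""
    return sum(responses) / len(responses)

def discrimination_index(responses, totals, fraction=0.27):
    """Difficulty among the top `fraction` of scorers minus difficulty
    among the bottom `fraction`; higher values mean the item separates
    strong students from weak ones."""
    order = sorted(range(len(totals)), key=lambda i: totals[i])
    k = max(1, int(len(totals) * fraction))
    lower = [responses[i] for i in order[:k]]   # lowest total scores
    upper = [responses[i] for i in order[-k:]]  # highest total scores
    return item_difficulty(upper) - item_difficulty(lower)

# 1 = correct, 0 = incorrect for one test item, per student
item = [1, 0, 1, 1, 0, 1, 0, 1, 1, 1]
# each student's total test score
totals = [78, 52, 85, 90, 45, 70, 50, 88, 92, 60]

print(f"difficulty: {item_difficulty(item):.2f}")              # 0.70
print(f"discrimination: {discrimination_index(item, totals):.2f}")  # 1.00
```

An item everyone gets right (difficulty near 1.0) or that strong and weak students answer alike (discrimination near 0) is a candidate for revision in the summative phase.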
21 Sample Summative Evaluation and Revision Plan
- Analyze student and faculty surveys; identify themes or trends
- Analyze assessment results
- Analyze help desk logs
- Examine course archives
- Compile a list of issues (including issues noted during the formative phase that have yet to be addressed)
- Research solutions and determine the time needed to fix each issue
- Assign priority ratings
- Assign tasks and establish deadlines
The plan will vary depending on how often (and how soon) the course is to be offered again, and on the resources available (release time and extra staff are less likely to be forthcoming than during initial development). You may have to limit revisions to urgent issues until time and resources are available.
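The triage steps above (compile issues, assign priority ratings, limit work to what time allows) can be sketched as a short script. Everything here, including the issue names, priorities, hour estimates, and the hour budget, is invented for illustration; it is not part of the presentation's plan.

```python
# Hypothetical revision-triage sketch: rank a compiled issue list by
# priority rating (1 = urgent), then by estimated effort, and schedule
# tasks until the available revision time runs out.

issues = [
    {"issue": "broken link in Lesson 3", "priority": 1, "hours": 0.5},
    {"issue": "rewrite group project rubric", "priority": 2, "hours": 6},
    {"issue": "chat tool instructions unclear", "priority": 1, "hours": 2},
    {"issue": "update external readings", "priority": 3, "hours": 4},
]

# Urgent issues first; within a priority level, quick fixes first
plan = sorted(issues, key=lambda i: (i["priority"], i["hours"]))

budget = 5.0  # hours available before the course is offered again
scheduled, spent = [], 0.0
for task in plan:
    if spent + task["hours"] > budget:
        break  # defer remaining tasks until time/resources are available
    spent += task["hours"]
    scheduled.append(task)

for task in scheduled:
    print(f"P{task['priority']}: {task['issue']} ({task['hours']}h)")
```

With these numbers only the two urgent fixes fit the budget, mirroring the slide's point that revisions may be limited to urgent issues until resources free up.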
22 Confirmative Evaluation: Why? What? When?
Why?
- Discover the long-term effectiveness of the course
- Address large-scale changes necessary to the curriculum: is the course still appropriate within the department (prerequisites for other courses, fit, etc.)? Should we continue using the LMS? Budgetary considerations
- Constant maintenance and improvements to technology, content, and strategies: the dynamic nature of online learning
What? When?
- Usually done some time after the completion of each semester
- Asks the question, "How are we doing now?"
23 Confirmative Evaluation: Who? How? What?
- Who will use the evaluation information? The instructor and administration.
- What is being evaluated?
  - Students' long-term retention of learning and its usefulness to their long-term goals
  - Long-term effectiveness of the course within the program
  - The LMS and other technology/tools
24 Confirmative Evaluation: Questions to Ask
- Are program/department needs being met (accreditations, prerequisites for other courses, competencies)?
- What are the trends in student satisfaction? Is the course valuable and meaningful to students' long-term goals (program/career)?
- Is the course scalable? Is it sustainable?
- Are the learning environment, technology, and tools still meeting our needs? (LMS evaluation; major course redesign)
25 Confirmative Evaluation: Gathering data
- Program student surveys
- Departmental administrative opinions
- Faculty peer review of learning materials
- Employer surveys
- Retention data
- Help desk logs
- LMS effectiveness study/survey
26 Common LMS Evaluation Criteria
- Cost: are costs rising at a reasonable rate?
- Server space and maintenance: are server space and maintenance needs being met?
- Vendor/software standards: are the vendor and software in compliance with required standards?
- Reliability: how reliable has the system been?
- Security: have there been any security concerns?
- Customization: what level of customization is possible within the system?
- Course structure and presentation: are we satisfied with the structure and presentation of courses? With the authoring tools provided?
- Tracking: are we satisfied with the tracking capabilities of the system?
- Assessment: are we satisfied with the testing engine and/or assessment tools available in the system?
27 Common LMS Evaluation Criteria
- Collaboration tools: are faculty, students, and staff satisfied with the collaboration tools (discussion areas, journaling, help desk, whiteboard) provided through the system?
- Productivity tools: are faculty, students, and staff satisfied with the productivity tools (calendar, help files, search engine) provided through the system?
- Documentation and training: is student/faculty/staff documentation or training sufficient? How usable do students, faculty, and staff find the tools?
- Vendor reputation and position: what is the vendor's reputation in the industry? What is the vendor's position in the industry?
30 Evaluation Activity
- Is it accessible? Is it scalable? Is it sustainable?
These questions will be asked and discussed for each learning object/activity idea on the flip chart. If the answer to any question is no, ideas for improvement will be discussed.