Formative and Summative Evaluations

By: Edith Leticia Cerda

Formative and Summative Evaluations
Instructional Design for Multimedia

Evaluation Phases
Formative evaluation takes place during the Analysis, Design, and Development phases; summative evaluation follows Implementation.

Formative Evaluation
- Occurs before implementation
- Determines the weaknesses in the instruction so that revisions can be made
- Makes instruction more effective and efficient

Formative Evaluation Is Especially Important When…
- The designer is a novice
- The content area is new
- The technology is new
- The audience is unfamiliar
- Task performance is critical
- Accountability is high
- The client requests or expects evaluation
- The instruction will be disseminated widely
- Opportunities for later revision are slim

Formative Evaluation Phases
- Design Reviews
- Expert Reviews
- Learner Validation
  - One-to-One Evaluation
  - Small-Group Evaluation
  - Field Trials
- Ongoing Evaluation

Design Reviews
Design reviews should take place after each step of the design process:
- Goal Review
- Review of Environment and Learner Analysis
- Review of Task Analysis
- Review of Assessment Specifications

Design Reviews: Goal Review
Question: Does the instructional goal reflect a satisfactory response to the problems identified in the needs assessment?
Possible methods: Have the client review and approve the learning goals.

Design Reviews: Environment and Learner Analysis Review
Question: Do the environment and learner analyses accurately portray these entities?
Possible methods:
- Collect survey or aptitude data
- Give a reading test to sample learners
- Survey managers to confirm attitudes

Design Reviews: Task Analysis Review
Question: Does the task analysis include all of the prerequisite skills and knowledge needed to perform the learning goal, and is the prerequisite nature of these skills and knowledge accurately represented?
Possible methods:
- Test groups of learners with and without the prerequisite skills
- Give a pretest on the skills to be learned to a sample audience

Design Reviews: Assessment Specification Review
Question: Do the test items and resultant test blueprints reflect reliable and valid measures of the instructional objectives?
Possible methods:
- Have experts review the assessment items
- Administer the assessment instruments to skilled learners to determine practicality (see the item-analysis sketch after this list)
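
One way to act on the reliability question above is a classical item analysis of pilot test data. The following is a minimal Python sketch; the response matrix and the flagging thresholds are assumptions for illustration, not part of the presentation.

# Classical item analysis: difficulty (proportion correct) and a rough
# discrimination index (upper half minus lower half) for each test item.
# Rows are learners, columns are items; 1 = correct, 0 = incorrect.
responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 1, 0, 1],
    [1, 1, 0, 0],
    [0, 0, 0, 1],
]

n = len(responses)
totals = [sum(row) for row in responses]  # each learner's total score
ranked = sorted(range(n), key=lambda i: totals[i])
low, high = ranked[: n // 2], ranked[n - n // 2:]

for item in range(len(responses[0])):
    p = sum(row[item] for row in responses) / n  # difficulty
    d = (sum(responses[i][item] for i in high) / len(high)
         - sum(responses[i][item] for i in low) / len(low))  # discrimination
    flag = "review" if p < 0.2 or p > 0.9 or d < 0.2 else "ok"
    print(f"Item {item + 1}: difficulty {p:.2f}, discrimination {d:.2f} ({flag})")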

Expert Reviews
Expert reviews should take place when the instructional materials are in draft form. Experts include:
- Content experts
- Instructional design experts
- Content-specific educational experts
- Learner experts

Expert Reviews: Content Experts
Subject matter experts (SMEs) review the materials for accuracy and completeness.
- Is the content accurate and up to date?
- Does the content present a consistent perspective?
Example: a physics expert

Expert Reviews: Instructional Design Experts
Review the materials for instructional strategy and theory.
- Are the instructional strategies consistent with principles of instructional theory?
Example: an instructional designer

Expert Reviews: Content-Specific Educational Experts
Review the pedagogical approach taken in the content area.
- Is the pedagogical approach consistent with current instructional theory in the content area?
Example: a science education specialist

Expert Reviews: Learner Experts
Review the appropriateness of the vocabulary, examples, and illustrations.
- Are the examples, practice exercises, and feedback realistic and accurate?
- Is the instruction appropriate for the target learners?
Example: a 6th-grade teacher

Expert Reviews: Process
- Distribute the draft materials to the experts
- Collect their comments and prioritize them into categories (one way to track this triage is sketched below):
  - Critical: revisions should be made immediately
  - Non-critical: disregard, or address at a later date
  - More info: gather more data or information before deciding
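
The triage above can be tracked with a very small data structure. A minimal Python sketch, assuming hypothetical reviewers, locations, and comments (none of these come from the presentation):

# Track expert-review comments by priority. The Priority labels mirror the
# slide's categories; everything else is illustrative.
from dataclasses import dataclass
from enum import Enum

class Priority(Enum):
    CRITICAL = "critical"          # revise immediately
    NON_CRITICAL = "non-critical"  # disregard or defer
    MORE_INFO = "more info"        # gather more data before deciding

@dataclass
class ReviewComment:
    reviewer: str   # e.g., "SME (physics)"
    location: str   # where in the draft the issue appears
    comment: str
    priority: Priority

comments = [
    ReviewComment("SME (physics)", "Module 2, p. 4",
                  "Momentum formula is wrong", Priority.CRITICAL),
    ReviewComment("6th-grade teacher", "Module 1, p. 2",
                  "Vocabulary too advanced", Priority.MORE_INFO),
]

# List critical items first so they are revised immediately.
for c in sorted(comments, key=lambda c: c.priority != Priority.CRITICAL):
    print(f"[{c.priority.value}] {c.location}: {c.comment} ({c.reviewer})")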

Learner Validation
Try the instruction with representative learners to see how well they learn and what problems arise as they engage with it:
- One-to-One Evaluations
- Small-Group Evaluations
- Field Trials

Learner Validation: One-to-One Evaluation
Present the materials to one learner at a time. Typical problems that arise include:
- Typographical errors
- Unclear sentences
- Poor or missing directions
- Inappropriate examples
- Unfamiliar vocabulary
- Mislabeled pages or illustrations
Make revisions to the instruction and conduct more evaluations if necessary.

Learner Validation: One-to-One Evaluation Process
- Present the materials to the student
- Watch the student interact with the material, employing the "read-think-aloud" method
- Continually query the student about problems they encounter and what they are thinking
- Assure the student that any problems lie in the instruction, not with them
- Tape-record the session or take notes
- Reward participation

Learner Validation: Small-Group Evaluation
Present the materials to 8-12 learners, and administer a questionnaire to obtain general demographic data and attitudes or experiences. Problems that arise might include:
- Students have more or fewer entry-level skills than anticipated
- The course is too long or too short
- Learners react negatively to the instruction
Make revisions to the instruction and conduct more evaluations if necessary.

Learner Validation: Small-Group Evaluation Process
- Conduct entry-level tests and pretests with the students
- Present the instruction to the students in a natural setting
- Observe the students interacting with the materials
- Take notes and/or videotape the session
- Intervene only when the instruction cannot proceed without assistance
- Administer a posttest (gains can be summarized as in the sketch after this list)
- Administer an attitudinal survey or hold a discussion
- Reward participation
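
Pretest and posttest scores from a small-group session are often summarized as learning gains. Below is a minimal Python sketch; the scores and the 100-point scale are invented for illustration.

# Summarize small-group pretest/posttest results as learning gains.
# Scores are hypothetical, on an assumed 0-100 scale.
MAX_SCORE = 100
pre = [45, 60, 52, 70, 38, 55, 62, 48]
post = [72, 85, 70, 88, 60, 79, 81, 66]

raw_gains = [b - a for a, b in zip(pre, post)]
# Normalized gain: the fraction of the possible improvement actually achieved.
norm_gains = [(b - a) / (MAX_SCORE - a) for a, b in zip(pre, post)]

print(f"Average raw gain: {sum(raw_gains) / len(raw_gains):.1f} points")
print(f"Average normalized gain: {sum(norm_gains) / len(norm_gains):.2f}")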

Learner Validation: Field Trials
Administer the instruction to 30 or more students. Problems that arise might include:
- The instruction is not implemented as designed
- Students have more or fewer entry-level skills than anticipated
- Assessments are too easy or too difficult
- The course is too long or too short
- Students react negatively to the instruction
Make revisions and conduct more field trials if necessary.

Learner Validation: Field Trial Process
- Administer the instruction to students in their normal setting, in various regions and with varying socioeconomic status
- Collect and analyze data from pretests and posttests (see the sketch after this list)
- Conduct follow-up interviews if necessary
- Administer a questionnaire to the instructors who deliver the training
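
Field-trial pretest/posttest data are commonly analyzed as paired comparisons. The sketch below uses only the Python standard library to compute the mean gain and a paired t statistic; all scores are invented, and a real analysis would also report a p-value (for example via scipy.stats.ttest_rel) and check its assumptions.

# Paired pretest/posttest analysis for a field trial (stdlib only).
import math
from statistics import mean, stdev

pre = [52, 61, 47, 70, 58, 64, 49, 55, 66, 60]
post = [68, 74, 59, 83, 71, 77, 60, 70, 80, 69]

diffs = [b - a for a, b in zip(pre, post)]  # per-student gains
n = len(diffs)
t = mean(diffs) / (stdev(diffs) / math.sqrt(n))  # paired t statistic

print(f"Mean gain: {mean(diffs):.1f} points over {n} students")
print(f"Paired t statistic: {t:.2f} (df = {n - 1})")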

Formative Evaluation: Ongoing Evaluation
- Continue to collect and analyze data
- Collect all comments and changes made by the teachers who deliver the instruction
- Keep track of changes in the learner population
- Revise the instruction, or produce new material to accompany it, as needed

Formative Evaluation Summary
- Conduct design reviews after each stage of design, including goals, environment and learner analysis, task analysis, and assessment specifications
- Conduct expert reviews with content, instructional design, content-specific educational, and learner experts
- Conduct one-to-one evaluations with individual students
- Conduct small-group evaluations with 8-12 students
- Conduct field trials with 30 or more students
- Conduct ongoing evaluations

Summative Evaluation
- Occurs after implementation (after the program has completed a full cycle)
- Determines the effectiveness, appeal, and efficiency of the instruction
- Assesses whether the instruction adequately solves the "problem" identified in the needs assessment

Summative Evaluation Phases
1. Determine Goals
2. Select Orientation
3. Select Design of Evaluation
4. Design or Select Evaluation Measures
5. Collect Data
6. Analyze Data
7. Report Results

Summative Evaluation: Determine Goals
Identify the questions that should be answered as a result of the evaluation:
- Does implementation of the instruction solve the problem identified in the needs assessment?
- Do the learners achieve the goals of the instruction?
- How do the learners feel about the instruction?
- What are the costs of the instruction, and what is the return on investment (ROI)? (See the sketch after this list.)
- How much time does it take for learners to complete the instruction?
- Is the instruction implemented as designed?
- What unexpected outcomes result from the instruction?
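
The ROI question above usually comes down to a simple ratio of net benefits to costs. The sketch below is a minimal worked example in Python; every figure in it is invented for illustration and is not from the presentation.

# Return on investment for a training program: ROI = (benefits - costs) / costs.
# All figures are invented for illustration.
costs = {
    "design and development": 40_000,
    "delivery and facilities": 15_000,
    "learner time": 25_000,
}
annual_benefit = 120_000  # assumed value of improved performance

total_cost = sum(costs.values())  # 80,000
roi = (annual_benefit - total_cost) / total_cost

print(f"Total cost: ${total_cost:,}")
print(f"ROI: {roi:.0%}")  # 50%: every dollar spent returns $1.50 in value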

Summative Evaluation: Determine Goals
Select indicators of success. If the program is successful, what will we observe in:
- The instructional materials?
- The learners' activities?
- The teachers' knowledge, practice, and attitudes?
- The learners' understanding, processes, skills, and attitudes?

Summative Evaluation: Select Orientation
Come to an agreement with the client on the most appropriate orientation for the evaluation:
- Objectivism: observation and quantitative data are collected to determine the degree to which the goals of the instruction have been met
- Subjectivism: expert judgment and qualitative data that are not tied to the instructional goals

Summative Evaluation: Select Design of Evaluation
Decide what data will be collected, when, and under what conditions. Common designs include:
- Instruction, Posttest
- Pretest, Instruction, Posttest
- Pretest, Instruction, Posttest, Posttest, Posttest (repeated posttests to track retention over time)

Summative Evaluation: Design or Select Evaluation Measures
- Payoff outcomes: review statistics that may have changed after the instruction was implemented
- Learning outcomes: measure for an increase in test scores
- Attitudes: conduct interviews, questionnaires, and observations (see the sketch after this list)
- Level of implementation: compare the design of the program with how it is actually implemented
- Costs: examine the costs to implement and continue the program, including personnel, facilities, equipment, and materials
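
Attitude questionnaires are often scored on a Likert scale and summarized per item. A minimal sketch follows; the items, the 1-5 scale, and the responses are all invented for illustration.

# Summarize Likert-scale questionnaire responses per item
# (1 = strongly disagree ... 5 = strongly agree). All data invented.
from statistics import mean

responses = {
    "The instruction held my interest": [4, 5, 3, 4, 4, 5, 2, 4],
    "The pace was appropriate": [3, 4, 2, 3, 4, 3, 3, 2],
    "I would recommend this course": [5, 4, 4, 5, 3, 4, 4, 5],
}

for item, scores in responses.items():
    favorable = sum(s >= 4 for s in scores) / len(scores)  # share rating 4 or 5
    print(f"{item}: mean {mean(scores):.1f}, {favorable:.0%} favorable")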

Summative Evaluation: Collect Data
Devise a plan for the collection of data that includes a schedule of data collection periods.

Summative Evaluation: Analyze Data
Analyze the data so that the client can easily see how the instructional program affected the problem presented in the needs assessment.
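
One way to make the analysis speak directly to the needs assessment is to report the original problem metric before and after the program. A minimal sketch, in which the metric (a monthly error rate) and both figures are assumptions for illustration:

# Compare the problem metric from the needs assessment before and after
# the program. The metric and the numbers are invented.
baseline_errors_per_1000 = 42      # from the needs assessment
post_program_errors_per_1000 = 17  # measured after implementation

reduction = baseline_errors_per_1000 - post_program_errors_per_1000
pct = reduction / baseline_errors_per_1000

print(f"Errors per 1,000 orders: {baseline_errors_per_1000} -> "
      f"{post_program_errors_per_1000} ({pct:.0%} reduction)")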

Summative Evaluation: Report Results
Prepare a report of the summative evaluation findings that includes:
- Summary
- Background
- Description of the evaluation study
- Results
- Discussion
- Conclusions and recommendations

Summative Evaluation Summary
1. Determine the goals of the evaluation
2. Select an objective or subjective orientation
3. Select the design of the evaluation plan
4. Design or select evaluation measures
5. Collect the data
6. Analyze the data
7. Report the results