Summative Evaluation: The Evaluation After Implementation

Involves _______ data  Collecting  Analyzing  Summarizing

For the purpose of Giving decision makers information on the effectiveness and efficiency of instruction
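
As an illustration of this collect / analyze / summarize cycle, the minimal Python sketch below summarizes a set of posttest scores for decision makers; the scores and the 75-point pass criterion are illustrative assumptions, not data from the slides.

from statistics import mean, stdev

# Collect: posttest scores gathered after the instruction was implemented (hypothetical data)
posttest_scores = [72, 85, 90, 65, 78, 88, 94, 70, 81, 76]

# Analyze: basic descriptive statistics plus the share of learners meeting an assumed 75-point criterion
summary = {
    "n": len(posttest_scores),
    "mean": round(mean(posttest_scores), 1),
    "std_dev": round(stdev(posttest_scores), 1),
    "pass_rate": sum(score >= 75 for score in posttest_scores) / len(posttest_scores),
}

# Summarize: report the figures decision makers will actually see
for name, value in summary.items():
    print(f"{name}: {value}")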

Effectiveness of Content  Did the instruction solve the problem?  Was the criterion created prior to the evaluation?  Was the criterion established in conjunction with the needs assessment?

Specifically  Did learners achieve the objectives?  How did learners feel about the instruction?  What were the costs?  How much time did it take?  Was the instruction implemented as designed?  What unexpected outcomes occurred?

Alternative Approaches to Summative Evaluation  Objectivism  Subjectivism

Objectivism  Based on empiricism  Answers questions on the basis of observed data  Goal-based and replicable; uses the scientific method

Subjectivism  Employs expert judgment  Includes qualitative methods  observation and interviews  expert review of content  May be “goal free”  evaluators are deliberately kept unaware of the stated goals

Objectivism (limitations)  Examines only a limited number of factors  May miss critical effects

Subjectivism (limitations)  Not replicable  May be biased by the idiosyncratic experiences and perspectives of the people who do the evaluation  May miss critical effects

The Designer’s Role in Summative Evaluation? Somewhat controversial  because the designer has a stake in the instruction, an external evaluator is often preferred

Timing of Summative Evaluation? Not during the first cycle of implementation  the instruction should first be revised through formative evaluation

Summary Diagram  Formative  Design Reviews  Expert Reviews  One-to-one Eval.  Small Group Eval.  Field Trials  Ongoing Eval.  Summative  Determine Goals of the Evaluation  Select Orientation  Select Design  Design or Select Evaluation Measures  Collect Data  Analyze Data  Report Results

Goals of the Evaluation  What decisions must be made?  What are the best questions to ask?  How practical is it to gather the data?  Who wants the answer to each question?  How much uncertainty is acceptable in the answers?

Orientation of the Evaluation  Goal-based or goal-free?  Is there a middle ground?  Is a quantitative or a qualitative approach appropriate?  An experimental or a naturalistic approach?

Select the Design of the Evaluation  The design describes what data to collect  when the data will be collected  and under what conditions  Issues to consider:  How much confidence must we have that the instruction caused the learning? (internal validity)  How important is generalizability? (external validity)  How much control do we have over the instructional situation?

Design or Select Evaluation Measures  Payoff outcomes  Is the problem solved?  Costs avoided  Increased outputs  Improved quality  Improved efficiency

Design or Select Evaluation Measures (2)  Learning Outcomes  For the summative evaluation, use the instruments you have already developed  but measure the entire program
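
A hedged sketch of what “measure the entire program” can look like in practice: for each objective, compute the share of learners who reached a mastery criterion. The objective labels, scores, and the 80% cutoff are illustrative assumptions.

# Per-objective mastery rates across the whole program (all values are illustrative)
program_results = {
    "Objective 1": [0.92, 0.75, 0.88, 0.81, 0.95],
    "Objective 2": [0.60, 0.85, 0.79, 0.91, 0.70],
    "Objective 3": [0.88, 0.90, 0.84, 0.77, 0.93],
}

MASTERY_CUTOFF = 0.80  # assumed criterion, ideally set during the needs assessment

for objective, scores in program_results.items():
    mastery_rate = sum(score >= MASTERY_CUTOFF for score in scores) / len(scores)
    print(f"{objective}: {mastery_rate:.0%} of learners reached mastery")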

Design or Select Evaluation Measures (3)  Attitudes  Rarely the primary payoff goals  Ask about learner attitudes toward  learning  instructional materials  subject matter  Indices of appeal  attention, likeableness, interest, relevance, familiarity, credibility, acceptability, and excitement
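
Attitude data from a Likert-style survey can be rolled up per index of appeal, as in the sketch below; the ratings and the 1-5 scale are illustrative assumptions.

from statistics import mean

# Hypothetical 1-5 ratings for a few of the indices of appeal listed above
attitude_ratings = {
    "interest":    [4, 5, 3, 4, 4],
    "relevance":   [5, 4, 4, 5, 3],
    "credibility": [3, 4, 4, 3, 4],
}

for index, ratings in attitude_ratings.items():
    print(f"{index}: mean rating {mean(ratings):.1f} out of 5")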

Design or Select Evaluation Measures (4)  Level of Implementation  the degree to which the instruction was implemented as designed  Costs  Cost-feasibility  Cost-effectiveness
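
Cost-effectiveness is often expressed as cost per unit of learning gain, as sketched below for two candidate programs; every figure is an illustrative assumption.

# Cost per point of average learning gain for two hypothetical programs (lower is better)
programs = {
    # name: (total cost in dollars, average pretest-to-posttest gain in points)
    "Current instruction": (12_000, 8.0),
    "Revised instruction": (18_000, 15.0),
}

for name, (total_cost, average_gain) in programs.items():
    cost_effectiveness = total_cost / average_gain
    print(f"{name}: ${cost_effectiveness:,.0f} per point of average gain")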

Alternative Designs  Instruction then posttest  Pretest then instruction then posttest
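
For the pretest-then-instruction-then-posttest design, the gain can be tested with a paired t-test, as in the sketch below; the scores are illustrative and SciPy is assumed to be available.

from scipy.stats import ttest_rel

pretest  = [55, 60, 48, 70, 62, 58, 65, 50]
posttest = [78, 82, 66, 88, 79, 75, 90, 72]

# Paired comparison: did scores improve from pretest to posttest?
t_statistic, p_value = ttest_rel(posttest, pretest)
mean_gain = sum(post - pre for pre, post in zip(pretest, posttest)) / len(pretest)

print(f"Mean gain: {mean_gain:.1f} points, t = {t_statistic:.2f}, p = {p_value:.4f}")

The posttest-only design is simpler, but without a pretest it is harder to argue that the gain came from the instruction rather than from what learners already knew, which is the internal-validity question raised earlier.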

The Report  Summary  Background  Needs assessment, audience, context, program description  Description of the evaluation study  Purpose of the evaluation, evaluation design, outcomes measured, implementation measures, cost-effectiveness information, analysis of unintentional outcomes

The Report (continued)  Results  outcomes, implementation, cost-effectiveness information, unintentional outcomes  Discussion  causal relationship between the program and the results  limitations of the study  Conclusions & Recommendations

Summary  Summative evaluation takes place after implementation  Objectivist and subjectivist approaches each have limitations  What to include in the report