Summative Evaluation: the evaluation conducted after implementation.
It involves collecting, analyzing, and summarizing data.
Its purpose is to give decision makers information on the effectiveness and efficiency of instruction.
Effectiveness of content: Did the instruction solve the problem? Was the criterion created prior to the evaluation? Was the criterion established in conjunction with the needs assessment?
Specifically: Did learners achieve the objectives? How did learners feel about the instruction? What were the costs? How much time did it take? Was the instruction implemented as designed? Were there unexpected outcomes?
Alternative approaches to summative evaluation: objectivism and subjectivism.
Objectivism: based on empiricism, answering questions on the basis of observed data. It is goal-based and replicable, and uses the scientific method.
Subjectivism: employs expert judgment and includes qualitative methods such as observation and interviews to evaluate the content. In "goal-free" evaluation, the evaluators are deliberately kept unaware of the instructional goals.
Limitations of objectivism: it examines only a limited number of factors and may miss critical effects.
Limitations of subjectivism: results are not replicable, are biased by the idiosyncratic experiences and perspectives of the people who do the evaluation, and may miss critical effects.
The designer's role in summative evaluation is somewhat controversial.
Timing of summative evaluation: not in the first cycle.
Summary diagram. Formative evaluation: design reviews, expert reviews, one-to-one evaluation, small-group evaluation, field trials, ongoing evaluation. Summative evaluation: determine the goals of the evaluation, select the orientation, select the design, design or select evaluation measures, collect data, analyze data, report results.
Goals of the evaluation: What decisions must be made? What are the best questions to ask? How practical is it to gather the data? Who wants the answer to each question? How much uncertainty can be tolerated?
Orientation of the evaluation: Goal-based, goal-free, or a middle ground? Are quantitative or qualitative methods appropriate? An experimental or a naturalistic approach?
Select the design of the evaluation. The design describes what data to collect, when the data will be collected, and under what conditions. Issues to consider: How much confidence must we have that the instruction caused the learning (internal validity)? How important is generalizability (external validity)? How much control do we have over the instructional situation?
Design or select evaluation measures. Payoff outcomes: Is the problem solved? Look for costs avoided, increased outputs, improved quality, and improved efficiency.
Design or select evaluation measures (2). Learning outcomes: use the instruments you have already developed, but measure the entire program.
Design or select evaluation measures (3). Attitudes are rarely the primary payoff goal. Ask about learner attitudes toward the learning experience, the instructional materials, and the subject matter. Indices of appeal include attention, likeableness, interest, relevance, familiarity, credibility, acceptability, and excitement.
Design or select evaluation measures (4). Level of implementation: the degree to which the instruction was implemented. Costs: cost-feasibility and cost-effectiveness.
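To make the cost measures concrete, here is a minimal sketch of how a cost-effectiveness comparison is usually set up; the symbols are illustrative placeholders, not something defined in the slides:

\[ \text{CE} = \frac{C}{E} \]

where \(C\) is the total cost of developing and delivering the instruction and \(E\) is the measured effect (for example, the average learning gain). Programs with a lower cost per unit of effect are preferred, while cost-feasibility simply asks whether \(C\) fits within the available budget.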
Alternative designs: (1) instruction followed by a posttest; (2) pretest, then instruction, then posttest.
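As a rough illustration of what each design estimates (assuming scores on a common instrument; the notation is not from the slides): in the posttest-only design the effect is judged from the posttest mean \(\bar{X}_{\text{post}}\) against a preset criterion, while in the pretest-posttest design the learning gain is estimated as

\[ G = \bar{X}_{\text{post}} - \bar{X}_{\text{pre}}, \]

which gives more confidence that the instruction, rather than prior knowledge, produced the result.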
The report: Summary. Background: needs assessment, audience, context, and program description. Description of the evaluation study: purpose of the evaluation, the evaluation design, outcomes measured, implementation measures, cost-effectiveness information, and analysis of unintended outcomes.
The report (continued): Results: outcomes, implementation, cost-effectiveness information, and unintended outcomes. Discussion: the causal relationship between the program and the results, and the limitations of the study. Conclusions and recommendations.
Summary: summative evaluation takes place after implementation; both objective and subjective approaches have limitations; and the report should include the elements outlined above.