Revising instructional materials

By: Edith Leticia Cerda

Similar presentations:

Curriculum Development and Course Design
CDI Module 10: A Review of Effective Skills Training
What is the ADDIE Model? By San Juanita Alanis. The ADDIE model is a systematic instructional design model consisting of five phases: –Analysis –Design.
The Instructional Design Process November 9, 2000.
Managing Learning and Knowledge Capital Human Resource Development: Chapter 11 Evaluation Copyright © 2010 Tilde University Press.
Summative Evaluation The Evaluation after implementation.
Effective Implementation Formative and Summative Project Evaluation.
Chapter 1 Introduction to Instructional Design.
Kirkpatrick.
Identifying Content and Specifying Behaviors
Formative and Summative Evaluations
Principles of High Quality Assessment
The ADDIE Instructional Design Process
Literacy Textual: The ability to read and write Oral: The ability to listen and speak Visual: The ability to interpret visual messages accurately and.
Essay Assessment Tasks
Instructional System Design
INSTRUCTIONAL MATERIALS: PRINCIPLES, DESIGN, UTILIZATION AND EVALUATION
Instructional Design Diana Fisher. Instructional Design Instructional Design (ID) is a dynamic process with constant movement back and forth between steps.
Instructional Design Eman Almasruhi.
Instructional Design Aldo Prado. Instructional Design Instructional design is the process of easing the acquisition of knowledge and making it more efficient.
Instructional Systems Design in Distance Education. Goal: This lesson will discuss Instructional Systems Design as it relates to distance.
ADDIE Instructional Design Model
+ Instructional Design Models EDU 560 Fall 2012 Online Module November 13, 2012.
LEARNING DIFFERENCES - AGENCY SELF-ASSESSMENT GUIDE Program Year A tool for identifying program improvement and professional development needs.
ASSESSMENT IN EDUCATION. Copyright Keith Morrison, 2004. PERFORMANCE ASSESSMENT... Concerns direct reality rather than disconnected.
Barry Williams. Analyzing Learners & Context. Dick & Carey, Chp. 5.
QIM 501 - INSTRUCTIONAL DESIGN AND DELIVERY. Dick & Carey Instructional Design Module. Prepared by: Omar Abdullah M. Al-Maktari PQM0025/08. Lecturer:
Additional Unit 2 Lecture Notes New Instructional Design Focus School of Education Additional Unit 2 Lecture Notes New Instructional Design Focus School.
Lecture 8A Designing and Conducting Formative Evaluations English Study Program FKIP _ UNSRI
Thinking and Communicating. "The Spiritual Life is Thinking!" (R.B. Thieme, Jr.)
Free Screen Cast 09ETG03 MinSeon Kim JiHye Park JiEun Yoon SeulAh Jung.
EDU 385 Education Assessment in the Classroom
The Analysis of the quality of learning achievement of the students enrolled in Introduction to Programming with Visual Basic 2010 Present By Thitima Chuangchai.
Read through problems. Identify problems you think your team has the capacity and interest to solve. Prioritize the problems and indicate the.
What is design? Blueprints of the instructional experience Outlining how to reach the instructional goals determined during the Analysis phase The outputs.
Performance-Based Assessment HPHE 3150 Dr. Ayers.
Preface the field of instructional design has continued to grow both as an area of study and as a profession. Increasing numbers of colleges and universities.
Lecture by: Chris Ross Chapter 7: Teacher-Designed Strategies.
Assessment and Testing
Subgrant Goals and Activities Frostburg State University.
Data Analysis Processes: Cause and Effect Linking Data Analysis Processes to Teacher Evaluation Name of School.
What Are the Characteristics of an Effective Portfolio? By Jay Barrett.
Chapter 14: Affective Assessment
The Instructional Design Process
Chapter 5 Informal Assessment.
Barry Williams. Systematic Planning for Materials and Media Utilization. TRDEV 531.
Re-Cap NGSS. Assessment, Evaluation, and Alignment.
Barry Williams. Designing & Conducting Formative Evaluation. Dick & Carey Chp. 10.
Instructional Design Course Evaluation Phase. Agenda The Evaluation Process Expert Review Small Group Review The Pilot Feedback and Revision Evaluation.
Dick & Carey Instructional Design Model Sabri Elamin ED 6615.
Principles of Instructional Design Assessment Recap Checkpoint Questions Prerequisite Skill Analysis.
© 2013, KDE and KASA. All rights reserved. FOUNDATIONS OF STUDENT GROWTH GOAL SETTING: DETERMINING STUDENT NEEDS SETTING A BASELINE What do my students.
CEIT 225 Instructional Design Prof. Dr. Kürşat Çağıltay
Incorporating Instructional Design into Library Instruction Classes NEFLIN Live Online July 7, 2011.
COM 535, S08 Designing and Conducting Formative Evaluations April 7, 2008.
Academic Seminar – Week 6 Lesson Plans & Formative Assessment Graphs.
The pre-/post-assessment: 3 points / 1 point. Introduction, purpose, learning targets, instruments, eliminate bias & distortion, writing conventions.
Teacher Work Sample. Lectures Objectives: 1.Define the teacher work sample. 2.Integrate lesson plans with a practice Teacher Work Sample in terms of the.
Other Testing Issues Chapter 7 Red book.
Instructional Design Models
RESEARCH TOOLS FOR UNDERSTANDING SPORTS CONSUMERS
Prepared by: Toni Joy Thurs Atayoc, RMT
Model of instructional systems design: Dick & Carey model
REPORTING.
Adjunct Training – August 2016 | Jason Anderson
Chapter 4 Instructional Media and Technologies for Learning
Presentation transcript:

Chapter 11: Revising Instructional Materials

The Dick and Carey Model (flowchart): Assess Needs to Identify Goal(s) → Conduct Instructional Analysis and Analyze Learners and Contexts → Write Performance Objectives → Develop Assessment Instruments → Develop Instructional Strategy → Develop and Select Instructional Materials → Design and Conduct Formative Evaluation of Instruction → Design and Conduct Summative Evaluation, with Revise Instruction feeding back from formative evaluation into the earlier steps.

Objectives
- Describe various methods for summarizing data obtained from formative evaluation studies.
- Summarize data obtained from formative evaluation studies.
- Given formative evaluation data for a set of instructional materials, identify problems in the materials and suggest revisions.

Two basic types of revisions: The first is changes made to the content or substance of the materials to make them more accurate or more effective as a learning tool. The second is changes to the procedures employed in using your materials.

Analyzing data from one-to-one trials. The designer has five kinds of basic information available:
- Learner characteristics and entry behaviors
- Direct responses to the instruction
- Learning time
- Posttest performance
- Responses to an attitude questionnaire

Analyzing data from one-to-one trials: the steps in analyzing the data. The first step is to describe the learners and to indicate their performance on any entry-behavior measures. Next, the designer should bring together all the comments and suggestions about the instruction that resulted from going through it with each learner. Finally, summarize performance data: begin by obtaining individual item performance, then combine item scores for each objective and for a total score, as in the sketch below.
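
The following is a minimal Python sketch of that aggregation step. The item names, the item-to-objective mapping, and the 0/1 scores are hypothetical placeholders, not data from the chapter.

```python
# Combine individual item scores (1 = correct, 0 = incorrect) for one
# learner into objective-level percentages and a total score.
item_scores = {"item1": 1, "item2": 1, "item3": 0, "item4": 1, "item5": 0}

# Hypothetical mapping of items to the objectives they measure.
objectives = {
    "objective_A": ["item1", "item2"],
    "objective_B": ["item3", "item4", "item5"],
}

# Percentage of items answered correctly per objective.
for obj, items in objectives.items():
    pct = 100 * sum(item_scores[i] for i in items) / len(items)
    print(f"{obj}: {pct:.0f}% of items correct")

# Total score across all items.
total = 100 * sum(item_scores.values()) / len(item_scores)
print(f"Total score: {total:.0f}%")
```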

Analyzing data from one-to-one trials: revising the instruction. Try to determine, based on learner performance, whether your rubric or test items are faulty. If they are flawed, they should be changed to make them consistent with the objectives and the intent of the instruction. If the items are satisfactory but the learners performed poorly, then the instruction must be changed.

Analyzing data from one-to-one trials You have three sources of suggestions for change: learner suggestions, learner performance, and your own reactions to the instruction.

Analyzing data from small-group and field trials. The available data typically include the following: item performance on the pretest and posttest; responses to an attitude questionnaire; learning and testing time; and comments made directly in the materials.

Analyzing data from small-group and field trials. The purpose of the item-by-objective analysis is threefold (a sketch follows below):
1. To determine the difficulty of each item for the group
2. To determine the difficulty of each objective for the group
3. To determine the consistency with which the set of items within an objective measures learners' performance on the objective
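
Here is a minimal Python sketch of such an analysis for a small group. The learners, scores, and item-to-objective assignments are hypothetical, and the spread of item difficulties within an objective stands in for a simple consistency check.

```python
# Item-by-objective analysis: rows are learners, columns are items
# (1 = correct, 0 = incorrect). All data are hypothetical.
scores = {
    "learner1": [1, 1, 1, 0, 1],
    "learner2": [1, 0, 1, 1, 1],
    "learner3": [0, 1, 1, 1, 0],
}
items_per_objective = {"objective_A": [0, 1], "objective_B": [2, 3, 4]}
n_learners = len(scores)
n_items = 5

# 1. Item difficulty: proportion of the group answering each item correctly.
item_difficulty = [
    sum(row[i] for row in scores.values()) / n_learners for i in range(n_items)
]
print("item difficulty:", item_difficulty)

# 2. Objective difficulty: mean difficulty of the items in the objective.
# 3. Consistency: spread of item difficulties within the objective --
#    a large spread suggests the items measure the objective inconsistently.
for obj, idxs in items_per_objective.items():
    diffs = [item_difficulty[i] for i in idxs]
    print(obj,
          "difficulty:", round(sum(diffs) / len(diffs), 2),
          "spread:", round(max(diffs) - min(diffs), 2))
```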

Analyzing data from small-group and field trials: learners' item-by-objective performance. From these data the designer could infer that:
- The group selected was appropriate for the evaluation,
- The instruction covered skills not previously mastered by the group, and
- The instruction was effective in improving learners' skills.

Why is individual item information required?
- Item information can be useful in deciding whether there are particular problems with the item or whether it is effectively measuring the performance described in its corresponding objective.
- Individual item information can be used to identify the nature of the difficulties learners are having with the instruction.
- Individual item data can be combined to indicate learner performance on an objective and, eventually, on the entire test.

Graphing learners' performances. One way to display data is through various graphing techniques, for example plotting pretest and posttest performance for each objective (see the sketch below). Another graphic technique for summarizing formative evaluation data involves the instructional analysis chart.
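
As an illustration, here is a small matplotlib sketch that graphs pretest versus posttest mastery per objective. The objective labels and percentages are invented for the example.

```python
# Bar chart of pretest vs. posttest performance for each objective.
import matplotlib.pyplot as plt

objectives = ["Obj 1", "Obj 2", "Obj 3", "Obj 4"]
pretest = [35, 20, 50, 10]    # % of learners mastering each objective
posttest = [85, 70, 90, 60]   # hypothetical percentages

x = range(len(objectives))
plt.bar([i - 0.2 for i in x], pretest, width=0.4, label="Pretest")
plt.bar([i + 0.2 for i in x], posttest, width=0.4, label="Posttest")
plt.xticks(list(x), objectives)
plt.ylabel("% of learners mastering objective")
plt.title("Pretest vs. posttest performance by objective")
plt.legend()
plt.show()
```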

Other types of data. It has been found that a good way to summarize data from an attitude questionnaire is to tabulate, question by question, the proportion of learners choosing each response alternative. Another important type of data is the comments obtained from learners, from other instructors involved in the formative evaluation, and from subject-matter experts who react to the materials. The final type of data summary you may wish to prepare relates to any alternative approaches you may have used during either the small-group or field-trial evaluations.
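
A short Python sketch of that kind of tabulation follows; the questions and responses are hypothetical.

```python
# Summarize attitude-questionnaire responses as the percentage of
# learners choosing each alternative. All responses are invented.
from collections import Counter

responses = {
    "The examples were helpful": ["agree", "agree", "neutral", "disagree"],
    "The pace was appropriate": ["agree", "neutral", "neutral", "agree"],
}

for question, answers in responses.items():
    counts = Counter(answers)
    summary = ", ".join(
        f"{choice}: {100 * n / len(answers):.0f}%"
        for choice, n in counts.items()
    )
    print(f"{question} -> {summary}")
```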

Sequence for examining data:
1. Entry behaviors
2. Pretests and posttests
3. Instructional strategy
4. Learning time
5. Instructional procedures

Revision process. We suggest that as you begin the revision process, you summarize your data as suggested in this chapter. Given all the data from a small-group or field-trial evaluation, the designer must make decisions about how to make the revisions. The strategies suggested for revising instruction following the one-to-one evaluations also apply at this point: namely, use the data, your experience, and sound learning principles as the bases for your revisions.

Revising selected materials. When working with selected materials, however, there is little opportunity to revise the materials directly, especially if they are commercially produced and copyrighted. With copyrighted materials, the instructor can consider the following adaptations for future trials: (1) omit portions of the instruction, (2) include other available materials, or (3) simply develop supplementary instruction.

Revising instructor-led instruction. Instructors working from an instructor's guide have the same flexibility as the developer for changing instruction. The instructor's notes in the guide should reflect questions raised by learners and the responses to those questions. Unlike with written instructional materials, the instructor can revise the presentation during its implementation and note the reasons for the change.