
1 IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010

2 Dick and Carey's Instructional Design Model
This training tool is intended to provide instructional designers with an overview of Dick and Carey's Instructional Design Model. It is not meant to be an all-inclusive list, but rather the models we felt were most beneficial for each phase of design.

3 How to Use This Tool (1 of 4)
The next four slides explain how to use this tool. Click the blue arrows in the text box to move between slides.
Instructions: On the Home screen, you will see ten icons similar to the one shown (here, the Design & Conduct Summative Evaluations icon). Each icon represents a phase in Dick & Carey's Instructional Design Model. Click each icon to learn more.

4 How to Use This Tool (2 of 4)
Instructions: Navigation buttons are available at the top right of each page. The left arrow takes you to the previous page viewed. The Home button takes you back to the home menu. The right arrow takes you to the next page.
(The Goal Analysis page is shown in the background as an example; its content appears later in this tool.)

5 How to Use This Tool (3 of 4)
Instructions: Click the Helpful Tips icon to get inside information regarding important do's and don'ts for each phase.

6 How to Use This Tool (4 of 4)
Instructions: Click the hyperlinks to get expanded information on important topics.

7 Home Menu
Identify Instructional Goals
Conduct Instructional Analysis
Analyze Learners & Contexts
Write Performance Objectives
Develop Assessment Instruments
Develop Instructional Strategy(ies)
Develop & Select Instructional Materials
Design & Conduct Formative Evaluations
Revise Instruction
Design & Conduct Summative Evaluations
Also on the menu: ? (help) and References

8 Goal Analysis: What Happens
During the Goal Analysis phase, designers determine what will be accomplished during the learning event. Two tasks happen during this phase:
Needs Analysis - used to gain an understanding of:
- Optimal performance or knowledge
- Actual or current performance or knowledge
- Feelings of trainees and others
- Causes of the problem from many perspectives
- Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
Goal Analysis - used to determine the overall goals of the training course.

9 Goal Analysis: Try Using
According to Dick and Carey (1996), a Training Needs Assessment should answer:
Step 1: Who are your learners? What are they like? What characteristics might affect your design of the learning environment? Find out information about learners:
- Cognitive abilities
- Previous experiences
- Motivational interests
- Personal learning styles
Step 2: What is the instructional need? According to the data collected in Step 1, what are learners currently not able to do that you need them to do? What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need (as cited in "ID Final Project: Lesson 3," 2003).

10 Goal Analysis: Try Using
Robert Mager's (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or amount you will consider acceptable.
Step Five: Test the statements with the question, 'If someone achieved or demonstrated each of these performances, would I be willing to say he or she had achieved the goal?' When you can answer 'yes,' the analysis is finished (p. 86).

11 Goal Analysis: Helpful Tips
Use action verbs to describe the performance objective and indicate observable behaviors. Describe the desired behavior that should result from the training.
Avoid using verbs like "know" or "understand" to describe the performance behavior, because these are not observable behaviors.
Don't write objectives that describe what the instructor or student will do in class; the objective describes the result, not the process.

12 Conduct Instructional Analysis: What Happens
When conducting an Instructional Analysis, designers discover what skills are needed to achieve the results identified in the goal analysis. The primary method is a Task Analysis: a list of the steps and skills used for each procedure in the course being designed.
Additional Resources:
Swanson, R. A. (1996). Analysis for improving performance: Tools for diagnosing organizations and documenting workplace expertise. San Francisco, CA: Berrett-Koehler.
Gupta, K. (2007). A practical guide to needs assessment (2nd ed.). San Francisco, CA: Pfeiffer.

13 Conduct Instructional Analysis: Try Using
Task Analysis information can be collected by:
- Observing employees on the job
- Interviewing employees and supervisors
- Reviewing documents, processes, policies, etc. for the job position ("Unit 2: Job Task Analysis," n.d., p. 4)
Step 1: Analyze learners and determine prerequisites.
Step 2: Identify job functions.
Step 3: Identify tasks within each function.
Step 4: Identify stages of the process.
Step 5: Is the task procedural? If yes, identify the steps and go to Step 7. If no, go to Step 6.
Step 6: Identify guidelines of the principle-based task.
Step 7: Identify knowledge needed to complete the task.
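To make the branching in Steps 5 through 7 concrete, here is a minimal Python sketch of the decision. This is our own illustration, not part of the source model; the Task class and its field names are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    """One task identified within a job function (hypothetical model)."""
    name: str
    procedural: bool                                      # Step 5: is the task procedural?
    steps: list[str] = field(default_factory=list)        # filled in when procedural
    guidelines: list[str] = field(default_factory=list)   # filled in when principle-based
    knowledge: list[str] = field(default_factory=list)    # Step 7: supporting knowledge

def analyze_task(task: Task) -> dict:
    """Walk Steps 5-7 of the task-analysis table for a single task."""
    if task.procedural:
        detail = {"type": "procedural", "steps": task.steps}                 # Step 5 -> yes
    else:
        detail = {"type": "principle-based", "guidelines": task.guidelines}  # Step 6
    detail["knowledge"] = task.knowledge                                     # Step 7
    return detail

# Example: a procedural task goes straight from Step 5 to Step 7.
reset_password = Task(
    name="Reset a user password",
    procedural=True,
    steps=["Verify identity", "Open the admin console", "Issue a temporary password"],
    knowledge=["Password policy", "Identity-verification rules"],
)
print(analyze_task(reset_password))
```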

14 Conduct Instructional Analysis: Helpful Tips
Use the tasks and steps identified in the analysis as a guide for structuring the learning activities.
When structuring learning activities, avoid including information that learners already know or don't need to know.

15 Analyze Learners and Contexts: What Happens
When conducting a Learner and Context Analysis, designers discover what knowledge, skills, abilities, and personalities their learners will bring to the training event.

16 Analyze Learners and Contexts: Try Using
Learner Analysis: Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions, and personnel files to determine:
- General characteristics: age, gender, language, culture
- Personal/social characteristics: maturity level, emotional level, expectations, aspirations, talents/interests, experience, physical capabilities
- Academic characteristics: education level, training levels completed, special courses completed, previous performance levels, test scores, GPA
- Specific entry characteristics: prerequisite skills, prior experience with the topic, reading level, attention span, attitudes toward work or the subject
- Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, active or reflective, sequential or global

17 Analyze Learners and Contexts: Try Using
Context Analysis has two parts: the Learning Context and the Performance Context.
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners' environment will be like on the job.

18 Analyze Learners and Contexts: Try Using
Performance Context: Nicholson (2004) explains four components of the performance context:
- Managerial Support: How will supervisors and managers support learners on the job?
- Physical Aspects of the Site: What equipment, facilities, and tools will be available?
- Social Aspects of the Site: Will learners work alone or in teams? Will they work in the office or in the field?
- Relevance of Skills to Workplace: "How relevant are the new skills to the actual workplace? Are there physical, social, or motivational constraints to the use of the new skills?" (Nicholson, 2004)

19 Analyze Learners and Contexts: Try Using
Learning Context: Nicholson (2004) explains four components of the learning context:
- Number and Nature of Sites: What facilities and equipment will be available for training?
- Compatibility of the Site with the Instructional Requirements: Are there any limitations to using the available training site(s)?
- Compatibility of the Site with Learner Needs: Does the site have the necessary conveniences, necessary equipment, and adequate space?
- Feasibility of Simulating the Workplace: How well can the actual work environment be simulated at the site? Can anything be done to make it more realistic?

20 Analyze Learners and Contexts: Helpful Tips
Determine what prior knowledge the learners have and how relevant the information to be learned is to them.
Don't assume what learners do and do not know. Doing so can lead to unexpected and unwelcome surprises when you launch the project.

21 Write Performance Objectives: What Happens
When writing Performance Objectives, designers translate data from the needs and goal analyses into specific objectives. These objectives will later be used to measure the quality of instruction and learning. They will also be used to:
- Determine whether the instruction being developed relates to its goals
- Guide the development of evaluation tools
- Give learners an idea of what content they should focus on
Objectives are different from goals. Goals are an overarching vision of what the course will accomplish; objectives are measurable descriptions of what you want the learners to demonstrate on the job. Two methods that can be used to create objectives are:
- Mager's Behavioral Objectives
- Bloom's Taxonomy

22 Write Performance Objectives: Try Using
Robert Mager (1997) developed a method for writing Behavioral Objectives that consists of three parts:
- Performance: What should the learner be able to do on the job?
- Condition: Under what conditions will the performance occur on the job?
- Criterion: How well should the learner be able to perform on the job?
Review the example below:
Example:
- Performance (What will they do?): Learners should be able to create effective behavioral objectives
- Condition (Under what conditions?): Given the "Write Performance Objectives" portion of the IIDT
- Criterion (How well?): That define how well learners should perform on the job
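The three-part structure lends itself to a simple data model. Below is a minimal Python sketch, our own illustration rather than part of the source tool; the class and method names are hypothetical, and the field names simply mirror Mager's three components.

```python
from dataclasses import dataclass

@dataclass
class BehavioralObjective:
    """A Mager-style objective: performance, condition, criterion."""
    performance: str  # What should the learner be able to do?
    condition: str    # Under what conditions will the performance occur?
    criterion: str    # How well should the learner perform?

    def render(self) -> str:
        # Combine the three parts into one readable objective statement.
        return f"{self.condition}, the learner will {self.performance} {self.criterion}."

objective = BehavioralObjective(
    performance="create effective behavioral objectives",
    condition="Given the 'Write Performance Objectives' portion of the IIDT",
    criterion="that define how well learners should perform on the job",
)
print(objective.render())
```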

23 Write Performance Objectives: Try Using
The Cognitive Domain of Bloom's Taxonomy can be used to align objectives with instructional activities. Decide what level the Performance Objective falls under, then use an action verb from the list below in Mager's Performance column. Bloom, Engelhart, Furst, Hill, and Krathwohl (1956) provide six levels in the cognitive domain:
1. Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize, relate, recall, repeat, reproduce, state
2. Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate, recognize, report, restate, review, select, translate
3. Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret, operate, practice, schedule, sketch, solve, use, write
4. Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize, differentiate, discriminate, distinguish, examine, experiment, question, test
5. Synthesis - arrange, assemble, collect, compose, construct, create, design, develop, formulate, manage, organize, plan, prepare, propose, set up, write
6. Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate, judge, predict, rate, score, select, support, value, evaluate
See the example on the next slide.
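One way to picture the alignment step is as a verb lookup. The sketch below is a hypothetical illustration, not part of the source tool; it holds only a handful of verbs per level for brevity, and the function name is our own.

```python
# Abbreviated verb lists per cognitive level (Bloom et al., 1956);
# only a few verbs per level are included here for brevity.
BLOOM_VERBS = {
    "Knowledge":     {"define", "list", "name", "recall", "state"},
    "Comprehension": {"classify", "describe", "explain", "identify"},
    "Application":   {"apply", "demonstrate", "illustrate", "solve", "use"},
    "Analysis":      {"analyze", "compare", "contrast", "differentiate"},
    "Synthesis":     {"assemble", "compose", "construct", "create", "design"},
    "Evaluation":    {"appraise", "assess", "defend", "judge", "evaluate"},
}

def classify_verb(performance: str) -> str | None:
    """Return the Bloom level whose verb list contains the objective's action verb."""
    verb = performance.split()[0].lower()   # assumes the verb leads the statement
    for level, verbs in BLOOM_VERBS.items():
        if verb in verbs:
            return level
    return None  # verb not found: consider a more observable action verb

print(classify_verb("create effective behavioral objectives"))  # -> Synthesis
```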

24 Write Performance Objectives
Example of Mager's Behavioral Objectives used with Bloom's Taxonomy:
- Bloom's Taxonomy Level: Synthesis
- Performance (What will they do?): Learners should be able to create effective behavioral objectives
- Condition (Under what conditions?): Given the "Write Performance Objectives" portion of the IIDT
- Criterion (How well?): That define how well learners should perform on the job
Under the Performance column, notice that the verb "create" corresponds to Bloom's Synthesis level. When developing activities for this objective, designers should ask learners to "create effective behavioral objectives." By making activities congruent with the correct level of Bloom's Cognitive Domain, designers ensure learners are developing the correct KSABs.

25 Write Performance Objectives: Helpful Tips
Analyze the desired performance carefully to determine the levels to be covered in the training.
Higher on the taxonomy is not necessarily better. If Application is the appropriate highest taxonomy level, stay at that level.

26 Develop Assessment Instruments: What Happens
Before designing training, develop Assessment Instruments to:
- Determine whether learners have the necessary prerequisites to learn the skills used in the course
- Evaluate what knowledge, skills, and abilities learners gained during the course
- Document learners' progress
- Aid in creating Formative and Summative evaluations
- Determine performance measures before developing the lesson plan and instructional materials ("ID Final Project: Lesson 7," 2003)

27 Develop Assessment Instruments: Try Using
"ID Final Project: Lesson 7" (2003) lists four types of assessment instruments to develop:
- Entry Behaviors Test: given prior to instruction to assess learners' mastery of prerequisite skills
- Pre-test: given prior to instruction to assess whether learners have already mastered some of the course skills identified during the instructional analysis
- Practice Tests: given during instruction to let learners rehearse the new skills they are learning and to allow for corrective feedback
- Post-test: given following instruction to determine whether learners have achieved the ability to carry out the performance objectives

28 Develop Assessment Instruments: Helpful Tips
Begin with the desired performance and work backward to determine what will be needed to achieve the objectives.
Don't wait until you've developed the training before you look at how to assess it. Training design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).

29 Develop Instructional Strategy: What Happens
When developing an Instructional Strategy, designers "identify and employ teaching strategies and techniques that most effectively achieve the performance objectives" (Gagné, Briggs, & Wager, 1992). Designing instruction is about more than choosing the mode of delivery. Much as a screenplay sets the stage for a movie, the instructional strategy is the screenplay for learning. One method available to guide designers in developing instruction is Gagné's 9 Events of Instruction.

30 Develop Instructional Strategy: Try Using
Robert Gagné's (1992) 9 Events of Instruction:
1. Gain attention - ensure learners are ready to learn
2. Inform learners of objectives - ensure learners know what they are going to learn
3. Stimulate recall of prior learning - tie prior knowledge to what they are about to learn
4. Present the content - introduce the new material
5. Provide learning guidance - use examples, case studies, role play, etc. to help learners better understand the material
6. Elicit performance (practice) - allow the learner to practice
7. Provide feedback - provide specific and immediate feedback to guide learners
8. Assess performance - give a post/final test to assess learners' mastery of the material
9. Enhance retention and transfer to the job - create job aids, references, tools, etc. that learners can use on the job, and find a way to ensure that what learners gained from the course transfers to their jobs
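The nine events can serve as a planning checklist. Here is a minimal, hypothetical Python sketch of that use; the function name and the sample activities are our own illustration, not part of the source material.

```python
# The nine events as an ordered checklist a designer might walk through
# when outlining a lesson (the ordering is a guideline, not a requirement).
GAGNE_EVENTS = [
    "Gain attention",
    "Inform learners of objectives",
    "Stimulate recall of prior learning",
    "Present the content",
    "Provide learning guidance",
    "Elicit performance (practice)",
    "Provide feedback",
    "Assess performance",
    "Enhance retention and transfer to the job",
]

def outline_lesson(activities: dict[str, str]) -> list[str]:
    """Map planned activities onto the events, flagging any event left unaddressed."""
    return [f"{i}. {event}: {activities.get(event, '(not yet addressed)')}"
            for i, event in enumerate(GAGNE_EVENTS, start=1)]

plan = {
    "Gain attention": "Open with a short story about a failed password reset",
    "Present the content": "Demonstrate the reset procedure step by step",
    "Elicit performance (practice)": "Learners reset a test account themselves",
}
print("\n".join(outline_lesson(plan)))
```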

31 Develop Instructional Strategy: Helpful Tips
Classify learning outcomes. Different types of learning require different types of training (e.g., skills vs. attitude).
The 9 events do not have to be performed in sequential order, as separate segments, or at all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9 Events are a flexible guideline, not an absolute blueprint.

32 Develop and Select Instructional Materials: What Happens
When developing and selecting Instructional Materials, designers select the print and electronic instructional materials to use in the course. Ideally, existing materials should be used, although they may need improvement or revision.

33 Develop and Select Instructional Materials: Try Using
The materials may take various forms: print, computer, audio, audio-video, etc. Each media type has benefits and drawbacks depending on the budget and the learning situation. The table on the following slide, adapted from "Strategies for Developing Instructional Materials for the Interpersonal Domain" (2010), lists media types along with their benefits and considerations.

34 Develop and Select Instructional Materials
Simulations
- Benefits: permits independence in the learning process; contextualizes content; can provide multiple perspectives; develops critical thinking skills
- Considerations: can be expensive; feedback is important to success
Training Games
- Benefits: highly motivational; encourages teamwork; uses problem-solving skills; develops communication skills
- Considerations: difficult with large groups; can require extensive guidance to be effective
Role Playing
- Benefits: introduces real-world situations; promotes understanding of other positions; emphasizes working together; provides opportunities to give and receive feedback
Interactive Games
- Benefits: engages the learner; develops strategic thinking skills
- Considerations: best with individuals or small groups; may require support materials to ensure learning
Video
- Benefits: great for large groups; provides for safe observation; can include real-life situations; can develop critical thinking
- Considerations: technology requirements; difficult to adapt; needs discussion and practice opportunities
Job Aids
- Benefits: provides for rapid instruction; inexpensive; can be used with any size group; provides opportunities for self-assessment; good as a support tool
- Considerations: needs practice opportunities to ensure transfer

35 Develop and Select Instructional Materials: Helpful Tips
Consider both the work and the learner when designing a job aid. What steps need to be taken to complete the task? What is the experience level of the learner?
Avoid confusing the learner: include only the steps necessary to complete the task, and use words the learner can easily understand. Don't use industry jargon or long, obscure words.
Include job aids that learners can keep for reference.

36 Design and Conduct Formative Evaluation: What Happens
When designing and conducting Formative Evaluations, designers gather data to revise and improve the instruction and the materials created for the course. SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
1. Did you identify training needs correctly?
2. Have you noticed other areas which need attention?
3. Are there indications that the training objectives will be met?
4. Do you need to revise the objectives?
5. Are you fully covering training topics?
6. Do you need to include additional training topics?
7. Are the training methods appropriate, or do you need to adjust them? ("How To Do Formative Training Evaluation?," 1999)

37 Design and Conduct Formative Evaluation: Try Using
Dick and Carey (1996) list six stages of Formative Evaluation:
1. Design Review - determine whether the instructional design matches the analysis
2. Expert Review - ensure the content is accurate
3. One-to-One - determine the course's impact on ARCS factors (attention, relevance, confidence, satisfaction)
4. Small Group - verify feedback from the one-to-one evaluations; look for additional issues
5. Field Trials - verify feedback from the small-group evaluations; look for context-related issues
6. Ongoing Evaluation - continually evaluate the training to ensure it remains relevant

38 Design and Conduct Formative Evaluation: Helpful Tips
Conduct a formative evaluation with a test group before the official roll-out of the training.
Don't rely on a simple "smile sheet." Although you hope learners will enjoy the instruction, your focus must be on whether they have learned the desired skills and can apply them to their jobs. Happy learners are not the same as better performers.

39 Design and Conduct Summative Evaluation: What Happens
When designing and conducting Summative Evaluations, designers study the effectiveness of the instruction as a whole. Summative Evaluation begins after the Formative Evaluations are complete and the instruction has been implemented. It provides information about how much the instruction has improved performance and how that improvement has affected the workplace.

40 Design and Conduct Summative Evaluation: Try Using
Donald Kirkpatrick's (2009) 4 Levels of Training Evaluation:
Level 1: Learner Reaction - Were the learners satisfied with the training?
Level 2: Performance Evaluation - Did they gain the intended KSABs?
Level 3: Behavior Evaluation - Are they applying their newly acquired KSABs on the job?
Level 4: Effect on the Organization - To what degree did the training achieve the desired impact on the organization?
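As a quick illustration of how the four levels might organize an evaluation plan, here is a minimal Python sketch; the level names come from the slide above, while the example instruments are our own hypothetical choices.

```python
# Kirkpatrick's four levels paired with hypothetical example instruments.
EVALUATION_PLAN = {
    1: ("Learner Reaction", "End-of-course satisfaction survey"),
    2: ("Performance Evaluation", "Post-test scored against the performance objectives"),
    3: ("Behavior Evaluation", "Manager observation checklist 90 days after training"),
    4: ("Effect on the Organization", "Quarterly business metric tied to the training goal"),
}

for level, (name, instrument) in EVALUATION_PLAN.items():
    print(f"Level {level} ({name}): {instrument}")
```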

41 Design and Conduct Summative Evaluation: Helpful Tips
Look at all four levels of Kirkpatrick's model. Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to the organization, so do not limit yourself to only the first two levels (Reaction and Learning).

42 Revise Instruction: What Happens
When revising instruction, designers examine the results of the formative evaluations and adjust the instructional intervention accordingly. You may have to revise your goals, objectives, or analyses as well as your materials and methods. Revisions might include:
- Modifying the instructional objective to focus it more clearly on the organizational goal
- Adjusting assumptions about learners' prior knowledge of similar subjects
- Increasing or decreasing the speed at which new information is delivered
- Replacing or deleting less effective learning activities

43 Revise Instruction: Helpful Tips
Ask yourself the following three questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
After editing one phase, consider its effect on all the other phases.

44 References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy of educational objectives: The classification of educational goals (Handbook I: Cognitive domain). New York, NY: David McKay Company, Inc.
Dick, W., & Carey, L. (1996). The systematic design of instruction. New York, NY: HarperCollins Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.). Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
ID final project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from

45 References (continued)
Kirkpatrick, J., & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50 years. Retrieved from
Mager, R. F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them (3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis [PowerPoint slides]. Retrieved from
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010). Retrieved from
Unit 2: Job task analysis. (n.d.). Retrieved from
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from

