
ALS Design, Build, Review: Using PDE’s Online Tools to Implement the SLO Process SAS Portal:


3 Navigate to the homeroom page: RIA Homeroom site.

4 Log in; if you are not yet a user, register for the site. Pause until the entire room is registered or paired with a partner.

5 Review the Home Page for information, then open ALS.

6 The ALS box expands…

7 Assessment Literacy Series -Module 3- Item Specifications

8 Objectives
Participants will:
1. Examine multiple choice and constructed response items/tasks.
2. Develop items/tasks that fit the previously created specifications and blueprint in terms of:
 - content accuracy;
 - item type;
 - cognitive load; and
 - sufficiency.

9 Helpful Tools
Participants may wish to reference the following:
Guides: Handout #4 – DoK Chart; Handout #5 – Item Examples
Other “Stuff”: DoK Chart II; “Smart Book”; textbooks, teacher manuals, and other supplemental materials

10 Outline of Module 3
Module 3: Item Specifications
- Multiple Choice Items
- Constructed Response Items/Tasks
  - Short Constructed Response
  - Extended Constructed Response
- Process Steps

11 Multiple Choice (MC) Guidelines
- MC items consist of a stem and answer options
- Grades K-1 = 3 options; Grades 2-12 = 4 options
- MC items contain only one correct answer
- Other options (distractors) have the same structure and length as the answer
- Distractors should be plausible (realistic)
- Balance the placement of the correct answer
- Avoid answer options that provide clues to the answer

12 MC Guidelines (cont.)
- Answer options should be in ascending or descending order when possible
- Avoid “All of the above” and “None of the above”
- Directions state what to do, where and how to respond, and point values
- Refrain from adding directions when test-takers are repeating the same behavior from item to item

13 Constructed Response (CR) Guidelines
- Language is appropriate to the age and experience of the students
- Student expectations are clear: Explain vs. Discuss; Describe vs. Comment
- State the extent of the expected answer: give three reasons vs. give some reasons
- Directions state what to do, where and how to respond, and point values
- Refrain from adding directions when test-takers repeat the same behavior from item/task to item/task

14 CR Guidelines (cont.)
Short Constructed Response (SCR) items/tasks:
- One step to solve
- Requires a brief response (2-5 minutes)
- Worth up to 3 points
Extended Constructed Response (ECR) items/tasks:
- Multiple steps to solve
- Requires 5-10 minutes to answer
- Worth 4 or more points
Note: ALL CR items/tasks require a well-developed scoring rubric.

15 Helpful Hints
Scenarios and passages should be:
- Relatively short
- Developmentally appropriate
- Sensitive to readability
Performance expectations must state exactly what is to be observed and how it will be measured.
Items/tasks are considered secure even in draft form. Copyrights must be handled appropriately.
Images, graphs, and charts must be clear and of sufficient size for interpretation.

16 Item Tag Codification
Each item/task requires a unique identifier.
- Tags contain information used to code and identify items/tasks.
- Item tags typically contain: item number, subject, grade, post test, item type, DoK level, and content standard number.
Example: 0008.MTH.GR4.POST.MC-LV2-4OA1
(Item # . Subject . Grade . Post . Item Type - DoK - Standard ID)
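The tag layout described above can be sketched as a small parser. This is an illustrative sketch only; the field names are assumptions drawn from the slide's legend, not an official PDE schema:

```python
# Hypothetical sketch: splitting an item tag like 0008.MTH.GR4.POST.MC-LV2-4OA1
# into its parts. Field names are assumptions based on the slide's legend.

def parse_item_tag(tag: str) -> dict:
    """Split a dot-delimited item tag; the last segment holds type-DoK-standard."""
    item_number, subject, grade, administration, rest = tag.split(".")
    item_type, dok_level, standard_id = rest.split("-", 2)
    return {
        "item_number": item_number,        # e.g. 0008
        "subject": subject,                # e.g. MTH
        "grade": grade,                    # e.g. GR4
        "administration": administration,  # e.g. POST
        "item_type": item_type,            # e.g. MC
        "dok_level": dok_level,            # e.g. LV2
        "standard_id": standard_id,        # e.g. 4OA1
    }

print(parse_item_tag("0008.MTH.GR4.POST.MC-LV2-4OA1"))
```

The same sketch handles the longer tags shown in the Module 4 scoring key example, since only the final segment varies in length.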

17 Process Steps
1. Review specification tables, blueprint, and targeted standards.
2. Draft the item, passage, prompt, or scenario. Insert any necessary tables, graphs, or images.
3. For MC items, create the single correct answer and then identify distractors that reflect common errors/misinterpretations. For CR items/tasks, create a scoring rubric that articulates different levels of performance, including a sample response for each level.
4. Specify the item/task unique ID (e.g., item number.subject.grade.post.item type-DoK level-content standard number).
5. Repeat Steps 1-4 for the remainder of the items/tasks needed to complete the blueprint.

18 QA Checklist
- Types, point values, and DoKs match the blueprint.
- There are sufficient items/tasks to sample the targeted content.
- The items/tasks are developmentally appropriate for the intended test-takers.
- The correct answers and/or expected responses are clearly identified.
- Each item/task is assigned a unique ID.

19 Think-Pair-Share
Based on the discussed item specifications, decide what is wrong with the following questions.
1. Some scientists believe that Pluto is __________.
A. an escaped moon of Neptune
B. usually the most distant planet in the solar system
C. the name of Mickey Mouse’s dog
D. all of the above
2. The people of Iceland __________.
A. a country located just outside the Arctic Circle
B. work to keep their culture alive
C. claim to be descendants of the Aztecs
D. the capital, Reykjavik, where arms talks have been held

20 Summary & Next Steps
Summary (Module 3: Item Specifications): Developed performance measure items/tasks that match the applicable blueprints.
Next Steps (Module 4: Scoring Keys and Rubrics): Given the items/tasks created, develop scoring keys and rubrics.

21 Assessment Literacy Series -Module 4- Scoring Keys and Rubrics

22 Objectives
Participants will:
1. Develop scoring keys for all multiple choice items outlined within the blueprint.
2. Develop scoring rubrics for constructed response items/tasks that reflect a performance continuum.

23 Helpful Tools
Participants may wish to reference the following:
Guides: Handout #6 – Scoring Key Example; Handout #7 – Rubric Examples
Templates: Template #5 – Scoring Key-Rubric
Other “Stuff”: Performance Task Framework

24 Outline of Module 4
Module 4: Scoring Keys and Rubrics
- Scoring Key for MC
- Rubrics for Constructed Response
  - Sample Answers
  - Scoring Criteria
- Process Steps

25 Scoring Keys

26 Scoring Keys typically contain elements such as the following:
- Performance measure name or unique identifier
- Grade/Course
- Administration timeframe (e.g., fall, mid-term, final examination, spring)
- Item tag and type
- Maximum points possible
- Correct answer (MC) or rubric with sample answers or anchor papers (SCR & ECR)

27 Scoring Key Example [Handout #6]
Assessment Name: Algebra I End-of-Course | Grade/Course: Algebra I | Administration: Post-test (Spring) | UAI (Unique Assessment Identifier): 01.003.1112

Item # | Item Tag                        | Item Type | Point Value | Answer
1      | 0001.MTH.ALGI.POST.MC-LV1-8EE2  | MC        | 1           | C
2      | 0002.MTH.ALGI.POST.MC-LV1-ACED1 | MC        | 1           | B
3      | 0003.MTH.ALGI.POST.MC-LV1-8EE1  | MC        | 1           | C
4      | 0004.MTH.ALGI.POST.MC-LV1-ASSE1 | MC        | 1           | D
5      | 0005.MTH.ALGI.POST.MC-LV2-ASSE1 | MC        | 1           | A
6      | 0006.MTH.ALGI.POST.MC-LV2-7RP3  | MC        | 1           | C
7      | 0007.MTH.ALGI.POST.MC-LV2-ACED1 | MC        | 1           | D
8      | 0008.MTH.ALGI.POST.MC-LV2-ACED1 | MC        | 1           | D
9      | 0009.MTH.ALGI.POST.MC-LV1-ACED1 | MC        | 1           | B
10     | 0010.MTH.ALGI.POST.MC-LV1-AREI3 | MC        | 1           | A
11     | 0011.MTH.ALGI.POST.MC-LV1-FIF1  | MC        | 1           | D
12     | 0012.MTH.ALGI.POST.MC-LV2-ACED1 | MC        | 1           | A
13     | 0013.MTH.ALGI.POST.ECR-LV2-NRN2 | SCR       | 2           | See Scoring Rubric
14     | 0014.MTH.ALGI.POST.ECR-LV2-FIF1 | ECR       | 4           | See Scoring Rubric
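A scoring key for MC items amounts to a lookup table mapping item numbers to correct answers. A minimal sketch, using a hypothetical shortened key and student response set rather than the full example above (CR items are scored by rubric, not by key):

```python
# Illustrative sketch only: scoring MC responses against a key.
# The key and responses here are hypothetical, not from the handout.
answer_key = {1: "C", 2: "B", 3: "C", 4: "D", 5: "A", 6: "C"}
student = {1: "C", 2: "B", 3: "A", 4: "D", 5: "A", 6: "B"}

# One point per match; missing responses score zero.
score = sum(1 for item, correct in answer_key.items() if student.get(item) == correct)
print(f"{score}/{len(answer_key)}")  # → 4/6
```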

28 Scoring Keys
Scoring keys for MC items:
- Answers within the key must represent a single, correct response.
- Answers should be validated once the key is developed to avoid human error.
- Validating answers should be done prior to form review.
- Items changed during the review stage must be revalidated to ensure the scoring key is correct.

29 Process Steps [Template #4]
1. Enter the assessment information at the top of the Scoring Key.
2. Record the single, correct answer during item development. For SCR and ECR items/tasks, the scoring rubrics should be referenced in the answer column and put in the correct rubric table on the Rubric Template.
3. Record the item number, item tag, item type, and point value.
4. Record the MC answers in the answer column. For each CR item, include the general scoring rubric and sample response for each point value.
5. Repeat Steps 1-4 until all items/tasks on the blueprint are reflected within the Scoring Key.

30 QA Checklist
- All items/tasks articulated on the blueprint are represented within the Scoring Key.
- MC items have been validated to ensure only one correct answer exists among the possible options provided.
- MC answers do not create a discernible pattern.
- MC answers are “balanced” among the possible options.
- Scoring Key answers are revalidated after the final operational form reviews are complete.
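The "balanced answers" check in the QA list above can be sketched as a simple tally over the keyed answers (here, the twelve MC answers from the example scoring key). The 1.5x-of-even-split threshold is an arbitrary illustrative choice, not a PDE rule:

```python
# Illustrative sketch (not part of the PDE toolkit): a quick balance check
# on the MC answers keyed in the example scoring key.
from collections import Counter

mc_answers = ["C", "B", "C", "D", "A", "C", "D", "D", "B", "A", "D", "A"]

counts = Counter(mc_answers)
print(counts)  # how often each option is keyed as correct

# Flag any option keyed correct far more often than an even split would give.
expected = len(mc_answers) / len("ABCD")  # 3.0 for 12 items, 4 options
skewed = [opt for opt, n in counts.items() if n > 1.5 * expected]
print(skewed)  # → [] (no option exceeds the threshold here)
```

Checking for a discernible pattern (e.g., a repeating A-B-C-D cycle) would need a separate scan over the answer sequence.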

31 Scoring Rubrics

32 Holistic vs. Analytic Rubric Scoring
Holistic Scoring:
- Provides a single score based on an overall determination of the student’s performance
- Assesses a student’s response as a whole for overall quality
- The most difficult approach for calibrating different raters
Analytic Scoring:
- Identifies and assesses specific aspects of a response
- Multiple dimension scores are assigned
- Subscores are combined logically into the overall assigned score
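The analytic combination of subscores can be illustrated with a small sketch. The dimension names and weights below are hypothetical, chosen only to show one logical way (a weighted sum) of combining subscores into an overall score:

```python
# Hypothetical sketch of analytic scoring: dimension subscores combined into
# an overall score via a weighted sum. Dimensions and weights are illustrative.

def analytic_score(subscores: dict, weights: dict) -> float:
    """Combine per-dimension subscores into a weighted overall score."""
    return sum(subscores[d] * weights[d] for d in weights)

subscores = {"content": 3, "organization": 2, "conventions": 4}
weights = {"content": 0.5, "organization": 0.3, "conventions": 0.2}
print(analytic_score(subscores, weights))  # 0.5*3 + 0.3*2 + 0.2*4 ≈ 2.9
```

A holistic rubric, by contrast, would assign the single overall score directly, with no subscores to combine.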

33 Rubric Scoring Considerations
- Describe whether spelling and/or grammar will impact the final score.
- Avoid using words like “many”, “some”, and “few” without adding numeric descriptors to quantify these terms.
- Avoid using words that are subjective, such as “creativity” or “effort”.
- Avoid subjective adjectives such as “excellent” or “inadequate”.

34 SCR Rubric Example [Handout #7]
General Scoring Rubric
2 points: The response gives evidence of a complete understanding of the problem. It is fully developed and clearly communicated. All parts of the problem are complete. There are no errors.
1 point: The response gives evidence of a reasonable approach but also indicates gaps in conceptual understanding. Parts of the problem may be missing. The explanation may be incomplete.
0 points: There is no response, or the work is completely incorrect or irrelevant.

35 SCR Rubric Example [Handout #7]
Sample Response: “In two complete sentences, explain why people should help save the rainforests.”
2 points: The student’s response is written in complete sentences and contains two valid reasons for saving the rainforest. “People must save the rainforest to save the animals’ homes. People need to save the rainforest because we get ingredients for many medicines from there.”
1 point: The student’s response contains only one reason. “People should save the rainforest because it is important and because people and animals need it.”

36 Rubrics for ECR Tasks
- Create content-based descriptions of the expected answer for each level of performance on the rubric.
- Provide an example of a fully complete/correct response along with examples of partially correct responses.
- Reference the item expectations in the rubric.
- Make the rubric as clear and concise as possible so that other scorers would assign exact/adjacent scores to the performance/work under observation.

37 ECR Rubric Example [Handout #7]
General Scoring Rubric
4 points: The response provides all aspects of a complete interpretation and/or a correct solution. The response thoroughly addresses the points relevant to the concept or task. It provides strong evidence that information, reasoning, and conclusions have a definite logical relationship. It is clearly focused and organized, showing relevance to the concept, task, or solution process.
3 points: The response provides the essential elements of an interpretation and/or a solution. It addresses the points relevant to the concept or task. It provides ample evidence that information, reasoning, and conclusions have a logical relationship. It is focused and organized, showing relevance to the concept, task, or solution process.
2 points: The response provides a partial interpretation and/or solution. It somewhat addresses the points relevant to the concept or task. It provides some evidence that information, reasoning, and conclusions have a relationship. It is relevant to the concept and/or task, but there are gaps in focus and organization.
1 point: The response provides an unclear, inaccurate interpretation and/or solution. It fails to address or omits significant aspects of the concept or task. It provides unrelated or unclear evidence that information, reasoning, and conclusions have a relationship. There is little evidence of focus or organization relevant to the concept, task, and/or solution process.
0 points: The response does not meet the criteria required to earn one point. The student may have written on a different topic or written "I don't know."

38 ECR Rubric Example [Handout #7]
Sample Response: “List the steps of the Scientific Method. Briefly explain each one.”
4 points:
1. Ask a Question - Ask a question about something that you observe: How, What, When, Who, Which, Why, or Where?
2. Do Background Research - Use library and Internet research to help you find the best way to do things.
3. Construct a Hypothesis - Make an educated guess about how things work.
4. Test Your Hypothesis - Do an experiment.
5. Analyze Your Data and Draw a Conclusion - Collect your measurements and analyze them to see if your hypothesis is true or false.
6. Communicate Your Results - Publish a final report in a scientific journal or by presenting the results on a poster.
3 points:
1. Ask a Question
2. Do Background Research - Use library and Internet research.
3. Construct a Hypothesis - An educated guess about how things work.
4. Test Your Hypothesis - Do an experiment.
5. Analyze Your Data and Draw a Conclusion
6. Communicate Your Results
2 points:
1. Ask a Question
2. Do Background Research
3. Construct a Hypothesis
4. Test Your Hypothesis
5. Analyze Your Data and Draw a Conclusion
6. Communicate Your Results
1 point: Ask a Question, Hypothesis, Do an Experiment, Analyze Your Data
0 points: “I don’t know.”

39 Process Steps [Template #4]
1. Create the item/task description for the student.
2. Using a “generic” rubric, begin by modifying the language using the specific criteria expected in the response to award the maximum number of points.
3. Next, determine how much the response can deviate from “fully correct” in order to earn the next (lower) point value. [Continue until the full range of possible scores is described.]
4. Using the “sample” rubric, create an example of a correct or possible answer for each level in the rubric.
5. In review, ensure the item/task description for the student, the scoring rubric, and the sample rubric are aligned.

40 QA Checklist
- CR items/tasks have scoring rubrics that reflect a performance continuum.
- CR items/tasks include sample responses for each level of performance.
- CR scoring rubrics are clear and concise.
- CR scoring rubrics include all dimensions (aspects) of the tasks presented to the students.
- CR scoring rubrics avoid including non-cognitive (motivation, timeliness, etc.) or content-irrelevant attributes.

41 Summary & Next Steps
Summary (Module 4: Scoring Keys & Rubrics): Developed a scoring key and rubrics for all items.
Next Steps (Module 5: Operational Forms & Administrative Guidelines): Given the items/tasks developed, create an operational form with applicable administrative guidelines.

42 Assessment Literacy Series -Module 5- Operational Forms and Guides

43 Objectives
Participants will:
1. Construct operational forms (performance measures) from developed items by:
 - Organizing and sequencing items/tasks
 - Constructing the form
 - Assigning item tags
 - Aligning to the Test Specifications & Blueprint
2. Develop teacher guidelines to administer the performance measure.

44 Helpful Tools
Participants may wish to reference the following:
Guides: Handout #8 – Operational Form Example; Handout #9 – Administrative Guidelines Example
Templates: Template #5 – Operational Form; Template #6 – Administrative Guidelines
Other “Stuff”: Performance Task Framework-Full

45 Outline of Module 5
Module 5: Operational Forms & Administrative Guidelines
- Operational Forms
- Administrative Guidelines
- Process Steps

46 Operational Forms [Handout #8]

47 Operational Forms
All developed items will now be inserted into the proper format: the Operational Form. Operational Forms contain the following components:
- Cover page
- Student information section
- Test-taker instructions
- Item directions
- Items/Tasks
- Answer options/response area
- Item tags

48 Operational Forms (cont.)
An operational form is the actual performance measure that will be given to the student. The form displays the sequence of items/passages/prompts the test-taker is to engage with and respond to during the testing period.

49 Operational Forms (cont.)
- The Test Specifications & Blueprint should be the same for the pre-test and the post-test.
- For some complex measures, the “operational form” may be a single prompt or task.
- All items must have an item tag.
- Consider the necessity for charts, graphs, images, figures, etc.
- Avoid creating a “queuing” or sequence bias among items/tasks.

50 Process Steps [Template #5]
1. Complete the cover design and demographic information necessary to reflect the performance measure developed and the context in which it will be used.
2. Organize the items/tasks in a sequence that will maximize student engagement.
3. Add directions to the test-taker instructions detailing the requirements for different types of items/tasks, especially for CR items/tasks.
4. Add item tags.
5. Screen and refine the draft form to minimize “blank space”, verify picture, graph, table, and figure placement in relation to the item/task, and ensure MC answer options do not drift from one page to the next.

51 QA Checklist
- Directions state what to do, where and how to respond, and the point value for items/tasks.
- Items/tasks on the operational form reflect the blueprint.
- The form layout minimizes “white space”, “crowding of items”, and distance from stem to response options/placement.
- Forms are free from unclear, small, or unnecessary images, figures, charts, or graphs.

52 Administrative Guidelines [Handout #9]

53 Administrative Guidelines
Administrative Guidelines are composed of three phases:
- Preparing for Test Administration
- Conducting Test Administration
- Post Administration

54 Administrative Guidelines (cont.)
Preparing for Test Administration:
- Establish the testing window.
- Specify what forms are needed.
- Review administration steps and student directions.
- Review the performance measures before administration.
- Establish a secure testing environment.
- Determine the process for make-up tests.

55 Administrative Guidelines (cont.)
Conducting Test Administration:
- Give directions for how to distribute forms.
- If possible, provide a script for the test administrator to use during administration.
- Provide directions for start/stop times, completing student information, and beginning and ending the testing session.

56 Administrative Guidelines (cont.)
Post Administration:
- Provide instructions for collecting and securing forms.
- Provide scoring instructions.
- Include how results are reported.
- Determine overall score and/or performance status (e.g., met minimum status).

57 Administrative Guidelines (cont.)
- Administrative guidelines must provide sufficient detail so that testing conditions are comparable between classrooms and across different schools.
- Activities within the testing environment (before, during, and after) must be articulated in a logical sequence.
- Test security and handling of scored materials must be outlined.

58 Process Steps
1. Use the Administrative Guidelines Template to create administrative steps for before, during, and after the test.
2. Explain any requirements or equipment necessary, including accommodations. State any ancillary materials (e.g., calculators) needed/allowed by the test-taker.
3. Identify the approximate time needed to take the test.
4. Include detailed “scripts” articulating exactly what is to be communicated to the test-takers, with particular emphasis on performance tasks.
5. Include procedures for scoring, make-up tests, and handling completed tests.

59 QA Checklist
- Guidelines explain administrative steps before, during, and after testing.
- Requirements for completing the performance measure items/tasks, including conditions, equipment, and material, are included.
- The approximate time needed to complete the overall performance measure is provided.
- Detailed “scripts” articulate the information to be communicated in a standardized manner.

60 Summary
Summary (Module 5: Operational Forms & Administrative Guidelines): Created operational forms (reflecting the developed blueprint) in conjunction with test administration guidelines.
Next Steps (Module 6: Form Reviews): Given the operational form, scoring key, and administrative guidelines, evaluate the quality of the materials, with particular emphasis on items/tasks within the performance measure.

61 Joining the SLO Professional Learning Community on SAS
Go to the SAS home page ( ). Log in with your user name and password. If you do not have an account with SAS, you will have to create one.

62 Enter your information on the login page and submit.

63 Once you have successfully logged in and are at the SAS home page, go to Teacher Tools in the upper right corner.

64 Click on Teacher Tools; this will provide you with various tools. Locate the button labeled “My Communities.”

65 This will open your membership to various Professional Learning Communities. If you are not a member of the Student Learning Objectives PLC, type SLO in the search bar.

66 Once a member of the SLO community, you will have access to communication with all other members and a calendar of upcoming events.

67 Along with posting questions to the entire community, you have access to the Digital Repository, in which SLO training materials and supporting documents are located. (This is located at the bottom of the SLO community page.)

68 Contact Info
PDE POC: Mr. O David Deitz
RIA POC: Dr. JP Beaudoin

69 SAS Institute 12.10.13
