ALS Design, Build, Review: Using PDE’s Online Tools to Implement the SLO Process SAS Portal: www.pdesas.org.

1 ALS Design, Build, Review: Using PDE’s Online Tools to Implement the SLO Process SAS Portal: www.pdesas.org

2 Navigate to the homeroom page: RIA Homeroom site.

3 Log in; if you are not yet a user, register for the site. Pause until the entire room is registered, or register with a partner.

4 Home Page for information:
Open ALS

5 The ALS box expands…

6 Assessment Literacy Series
-Module 3- Item Specifications IMT Orientation Draft 02Sept11-CS

7 Objectives
Participants will:
- Examine multiple choice and constructed response items/tasks.
- Develop items/tasks that fit the previously created specifications and blueprint in terms of content accuracy, item type, cognitive load, and sufficiency.

8 Helpful Tools
Participants may wish to reference the following:
Guides: Handout #4 – DoK Chart; Handout #5 – Item Examples
Other “Stuff”: DoK Chart II; “Smart Book”; textbooks, teacher manuals, and other supplemental materials

9 Outline of Module 3
Module 3: Item Specifications
- Multiple Choice Items
- Process Steps
- Constructed Response Items/Tasks
- Short Constructed Response
- Extended Constructed Response

10 Multiple Choice (MC) Guidelines
- MC items consist of a stem and answer options: Grades K-1 = 3 options; Grades 2-12 = 4 options.
- MC items contain only one correct answer.
- The other options (distractors) should match the structure and length of the correct answer.
- Distractors should be plausible (realistic).
- Balance the placement of the correct answer.
- Avoid answer options that provide clues to the answer.

11 MC Guidelines (cont.)
- Answer options should be in ascending or descending order when possible.
- Avoid “All of the above” and “None of the above.”
- Directions state what to do, where and how to respond, and point values.
- Refrain from adding directions when test-takers are repeating the same behavior from item to item.

12 Constructed Response (CR) Guidelines
- Language is appropriate to the age and experience of the students.
- Student expectations are clear: “Explain” vs. “Discuss”; “Describe” vs. “Comment.”
- State the extent of the expected answer: “Give three reasons” vs. “Give some reasons.”
- Directions state what to do, where and how to respond, and point values.
- Refrain from adding directions when test-takers repeat the same behavior from item/task to item/task.

13 CR Guidelines (cont.)
Short Constructed Response (SCR) items/tasks:
- One step to solve
- Requires a brief response (2-5 minutes)
- Worth up to 3 points
Extended Constructed Response (ECR) items/tasks:
- Multiple steps to solve
- Requires 5-10 minutes to answer
- Worth 4 or more points
Note: ALL CR items/tasks require a well-developed scoring rubric.

14 Helpful Hints
- Scenarios and passages should be relatively short, developmentally appropriate, and sensitive to readability.
- Performance expectations must state exactly what is to be observed and how it will be measured.
- Items/tasks are considered secure even in draft form.
- Copyrights must be handled appropriately.
- Images, graphs, and charts must be clear and of sufficient size for interpretation.

15 Item Tag Codification
Each item/task requires a unique identifier. Tags contain information used to code and identify items/tasks. Item tags typically contain:
- Item number
- Subject
- Grade
- Post test (administration)
- Item type
- DoK level
- Content standard number
Example: 0008.MTH.GR4.POST.MC-LV2-4OA1 (Item # 0008, Subject MTH, Grade GR4, Post administration, Item type MC, DoK level 2, Standard ID 4OA1)
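To make the tag structure concrete, here is a minimal Python sketch that splits a tag into its labeled parts. The field layout is inferred from the examples in these slides; the regular expression and helper name are illustrative, not part of the ALS materials.

import re

# Tag layout inferred from the slide examples, e.g. 0008.MTH.GR4.POST.MC-LV2-4OA1
ITEM_TAG = re.compile(
    r"(?P<item>\d{4})\."          # item number
    r"(?P<subject>[A-Z]+)\."      # subject, e.g. MTH
    r"(?P<grade>[A-Z0-9]+)\."     # grade/course, e.g. GR4 or ALGI
    r"(?P<admin>PRE|POST)\."      # administration
    r"(?P<type>MC|SCR|ECR)"       # item type
    r"-LV(?P<dok>\d)"             # DoK level
    r"-(?P<standard>[A-Z0-9]+)"   # content standard number
)

def parse_item_tag(tag: str) -> dict:
    """Split an item tag into its labeled components; raise if malformed."""
    match = ITEM_TAG.fullmatch(tag)
    if match is None:
        raise ValueError(f"Malformed item tag: {tag!r}")
    return match.groupdict()

print(parse_item_tag("0008.MTH.GR4.POST.MC-LV2-4OA1"))
# {'item': '0008', 'subject': 'MTH', 'grade': 'GR4', 'admin': 'POST',
#  'type': 'MC', 'dok': '2', 'standard': '4OA1'}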

16 Process Steps
1. Review specification tables, blueprint, and targeted standards.
2. Draft the item, passage, prompt, or scenario.
3. Insert any necessary tables, graphs, or images.
4. For MC items, create the single correct answer and then identify distractors that reflect common errors/misinterpretations. For CR items/tasks, create a scoring rubric that articulates different levels of performance, including a sample response for each level.
5. Specify the item/task unique ID (e.g., item number.subject.grade.post.item type-DoK level-content standard number).
6. Repeat the steps above for the remainder of the items/tasks needed to complete the blueprint.

17 QA Checklist
- Types, point values, and DoKs match the blueprint.
- There are sufficient items/tasks to sample the targeted content.
- The items/tasks are developmentally appropriate for the intended test-takers.
- The correct answers and/or expected responses are clearly identified.
- Each item/task is assigned a unique ID.

18 Think-Pair-Share
Based on the discussed item specifications, decide what is wrong with the following questions.
1. Some scientists believe that Pluto is __________.
a. an escaped moon of Neptune
b. usually the most distant planet in the solar system
c. the name of Mickey Mouse’s dog
d. all of the above
2. The people of Iceland __________.
a. a country located just outside the Arctic Circle
b. work to keep their culture alive
c. claim to be descendants of the Aztecs
d. the capital, Reykjavik, where arms talks have been held

19 Summary & Next Steps
Summary – Module 3: Item Specifications: Developed performance measure items/tasks that match the applicable blueprints.
Next Steps – Module 4: Scoring Keys and Rubrics: Given the items/tasks created, develop scoring keys and rubrics.

20 Assessment Literacy Series
-Module 4- Scoring Keys and Rubrics IMT Orientation Draft 02Sept11-CS

21 Objectives
Participants will:
- Develop scoring keys for all multiple choice items outlined within the blueprint.
- Develop scoring rubrics for constructed response items/tasks that reflect a performance continuum.

22 Helpful Tools
Participants may wish to reference the following:
Guides: Handout #6 – Scoring Key Example; Handout #7 – Rubric Examples
Templates: Template #5 – Scoring Key/Rubric
Other “Stuff”: Performance Task Framework

23 Outline of Module 4
Module 4: Scoring Keys and Rubrics
- Scoring Key for MC
- Process Steps
- Rubrics for Constructed Response
- Sample Answers
- Scoring Criteria

24 Scoring Keys

25 Scoring Keys
Scoring Keys typically contain elements such as the following:
- Performance measure name or unique identifier
- Grade/Course
- Administration timeframe (e.g., fall, mid-term, final examination, spring, etc.)
- Item tag and type
- Maximum points possible
- Correct answer (MC) or rubric with sample answers or anchor papers (SCR & ECR)

26 Scoring Key Example [Handout #6]
Assessment Name: Algebra I End-of-Course
Grade/Course: Algebra I
Administration: Post-test (Spring)
UAI (Unique Assessment Identifier):

Item # | Item Tag                         | Item Type | Point Value | Answer
1      | 0001.MTH.ALGI.POST.MC-LV1-8EE2   | MC        |             | C
2      | 0002.MTH.ALGI.POST.MC-LV1-ACED1  | MC        |             | B
3      | 0003.MTH.ALGI.POST.MC-LV1-8EE1   | MC        |             |
4      | 0004.MTH.ALGI.POST.MC-LV1-ASSE1  | MC        |             | D
5      | 0005.MTH.ALGI.POST.MC-LV2-ASSE1  | MC        |             | A
6      | 0006.MTH.ALGI.POST.MC-LV2-7RP3   | MC        |             |
7      | 0007.MTH.ALGI.POST.MC-LV2-ACED1  | MC        |             |
8      | 0008.MTH.ALGI.POST.MC-LV2-ACED1  | MC        |             |
9      | 0009.MTH.ALGI.POST.MC-LV1-ACED1  | MC        |             |
10     | 0010.MTH.ALGI.POST.MC-LV1-AREI3  | MC        |             |
11     | 0011.MTH.ALGI.POST.MC-LV1-FIF1   | MC        |             |
12     | 0012.MTH.ALGI.POST.MC-LV2-ACED1  | MC        |             |
13     | 0013.MTH.ALGI.POST.ECR-LV2-NRN2  | SCR       |             | See Scoring Rubric
14     | 0014.MTH.ALGI.POST.ECR-LV2-FIF1  | ECR       |             | See Scoring Rubric
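As a complement to the handout, here is a minimal sketch, not part of the ALS materials, showing how an MC scoring key like the one above can be applied to a student's answers. The key entries are taken from the example table; the 1 point per MC item is an assumption for illustration.

# A few key entries from the example above; 1 point per MC item is assumed.
scoring_key = {
    "0001.MTH.ALGI.POST.MC-LV1-8EE2": "C",
    "0002.MTH.ALGI.POST.MC-LV1-ACED1": "B",
    "0004.MTH.ALGI.POST.MC-LV1-ASSE1": "D",
    "0005.MTH.ALGI.POST.MC-LV2-ASSE1": "A",
}

def score_mc(responses: dict, key: dict, points_per_item: int = 1) -> int:
    """Total MC points earned; CR items/tasks are scored with rubrics instead."""
    return sum(points_per_item
               for tag, answer in responses.items()
               if key.get(tag) == answer)

student = {
    "0001.MTH.ALGI.POST.MC-LV1-8EE2": "C",   # correct
    "0002.MTH.ALGI.POST.MC-LV1-ACED1": "A",  # incorrect
}
print(score_mc(student, scoring_key))  # 1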

27 Scoring Keys
Scoring keys for MC items:
- Answers within the key must represent a single, correct response.
- Answers should be validated once the key is developed to avoid human error.
- Validating answers should be done prior to form review.
- Items changed during the review stage must be revalidated to ensure the scoring key is correct.

28 Process Steps [Template #4]
1. Enter the assessment information at the top of the Scoring Key.
2. Record the item number, item tag, item type, and point value.
3. Record the single, correct MC answer in the answer column during item development.
4. For SCR and ECR items/tasks, reference the scoring rubric in the answer column and place it in the correct rubric table on the Rubric Template. For each CR item, include the general scoring rubric and a sample response for each point value.
5. Repeat the steps above until all items/tasks on the blueprint are reflected within the Scoring Key.

29 QA Checklist
- All items/tasks articulated on the blueprint are represented within the Scoring Key.
- MC items have been validated to ensure only one correct answer exists among the possible options provided.
- MC answers do not create a discernible pattern.
- MC answers are “balanced” among the possible options.
- Scoring Key answers are revalidated after the final operational form reviews are complete.
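A sketch of what the "no discernible pattern" and "balanced" checks could look like in code, assuming the key answers are collected in a simple list; the max_run threshold is an arbitrary illustration, not an ALS rule.

from collections import Counter

def check_key_balance(answers: list, options: str = "ABCD", max_run: int = 3) -> None:
    """Print the answer distribution and flag long runs of identical answers."""
    counts = Counter(answers)
    print("Distribution:", {opt: counts.get(opt, 0) for opt in options})
    run = longest = 1
    for prev, curr in zip(answers, answers[1:]):
        run = run + 1 if curr == prev else 1
        longest = max(longest, run)
    if longest > max_run:
        print(f"Warning: a run of {longest} identical answers may be a discernible pattern.")

check_key_balance(["C", "B", "D", "A", "A", "A", "A", "B"])
# Distribution: {'A': 4, 'B': 2, 'C': 1, 'D': 1}
# Warning: a run of 4 identical answers may be a discernible pattern.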

30 Scoring Rubrics

31 Holistic vs. Analytic Rubric Scoring
Holistic Scoring:
- Provides a single score based on an overall determination of the student’s performance
- Assesses a student’s response as a whole for overall quality
- Most difficult approach for calibrating different raters
Analytic Scoring:
- Identifies and assesses specific aspects of a response
- Assigns multiple dimension scores
- Provides a logical combination of subscores to form the overall assigned score
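To illustrate the mechanics of analytic scoring, here is a minimal sketch of how dimension subscores can be logically combined into one overall score. The dimensions and weights are invented for illustration; a real rubric defines its own.

def analytic_total(subscores: dict, weights: dict) -> float:
    """Combine per-dimension rubric scores into an overall weighted score."""
    return sum(weights[dim] * score for dim, score in subscores.items())

# Hypothetical dimensions scored 0-4 each; weights sum to 1.0.
subscores = {"content": 3, "organization": 2, "conventions": 4}
weights = {"content": 0.5, "organization": 0.3, "conventions": 0.2}
print(round(analytic_total(subscores, weights), 2))  # 2.9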

32 Rubric Scoring Considerations
- Describe whether spelling and/or grammar will impact the final score.
- Avoid using words like “many,” “some,” and “few” without adding numeric descriptors to quantify these terms.
- Avoid using words that are subjective, such as “creativity” or “effort.”
- Avoid subjective adjectives such as “excellent” or “inadequate.”

33 SCR Rubric Example [Handout #7]
General Scoring Rubric
2 points: The response gives evidence of a complete understanding of the problem. It is fully developed and clearly communicated. All parts of the problem are complete. There are no errors.
1 point: The response gives evidence of a reasonable approach but also indicates gaps in conceptual understanding. Parts of the problem may be missing. The explanation may be incomplete.
0 points: There is no response, or the work is completely incorrect or irrelevant.

34 SCR Rubric Example [Handout #7]
Sample Response: “In two complete sentences, explain why people should help save the rainforests.”
2 points: The student’s response is written in complete sentences and contains two valid reasons for saving the rainforest. “People must save the rainforest to save the animals’ homes. People need to save the rainforest because we get ingredients for many medicines from there.”
1 point: The student’s response contains only one reason. “People should save the rainforest because it is important and because people and animals need it.”

35 Rubrics for ECR Tasks
- Create content-based descriptions of the expected answer for each level of performance on the rubric.
- Provide an example of a fully complete/correct response along with examples of partially correct responses.
- Reference the item expectations in the rubric.
- Make the rubric as clear and concise as possible so that other scorers would assign exact/adjacent scores to the performance/work under observation.

36 ECR Rubric Example [Handout #7]
General Scoring Rubric
4 points: The response provides all aspects of a complete interpretation and/or a correct solution. It thoroughly addresses the points relevant to the concept or task. It provides strong evidence that information, reasoning, and conclusions have a definite logical relationship. It is clearly focused and organized, showing relevance to the concept, task, or solution process.
3 points: The response provides the essential elements of an interpretation and/or a solution. It addresses the points relevant to the concept or task. It provides ample evidence that information, reasoning, and conclusions have a logical relationship. It is focused and organized, showing relevance to the concept, task, or solution process.
2 points: The response provides a partial interpretation and/or solution. It somewhat addresses the points relevant to the concept or task. It provides some evidence that information, reasoning, and conclusions have a relationship. It is relevant to the concept and/or task, but there are gaps in focus and organization.
1 point: The response provides an unclear, inaccurate interpretation and/or solution. It fails to address or omits significant aspects of the concept or task. It provides unrelated or unclear evidence that information, reasoning, and conclusions have a relationship. There is little evidence of focus or organization relevant to the concept, task, and/or solution process.
0 points: The response does not meet the criteria required to earn one point. The student may have written on a different topic or written “I don’t know.”

37 ECR Rubric Example [Handout #7]
Sample Response: “List the steps of the Scientific Method. Briefly explain each one.”
4 points:
- Ask a Question: Ask a question about something that you observe: How, What, When, Who, Which, Why, or Where?
- Do Background Research: Use library and Internet research to help you find the best way to do things.
- Construct a Hypothesis: Make an educated guess about how things work.
- Test Your Hypothesis: Do an experiment.
- Analyze Your Data and Draw a Conclusion: Collect your measurements and analyze them to see if your hypothesis is true or false.
- Communicate Your Results: Publish a final report in a scientific journal or by presenting the results on a poster.
3 points:
- Ask a Question
- Do Background Research: Use library and Internet research.
- Construct a Hypothesis: An educated guess about how things work.
- Analyze Your Data and Draw a Conclusion
- Communicate Your Results
2 points:
- Do Background Research
- Construct a Hypothesis
- Test Your Hypothesis
1 point: Ask a Question, Hypothesis, Do an Experiment, Analyze Your Data
0 points: “I don’t know.”

38 Process Steps [Template #4]
1. Create the item/task description for the student.
2. Using a “generic” rubric, begin by modifying the language with the specific criteria a response must meet to earn the maximum number of points.
3. Next, determine how much the response can deviate from “fully correct” in order to earn the next (lower) point value. Continue until the full range of possible scores is described.
4. Using the “sample” rubric, create an example of a correct or possible answer for each level in the rubric.
5. In review, ensure the item/task description for the student, the scoring rubric, and the sample rubric are aligned.

39 QA Checklist
- CR items/tasks have scoring rubrics that reflect a performance continuum.
- CR items/tasks include sample responses for each level of performance.
- CR scoring rubrics are clear and concise.
- CR scoring rubrics include all dimensions (aspects) of the tasks presented to the students.
- CR scoring rubrics avoid including non-cognitive attributes (motivation, timeliness, etc.) or content-irrelevant attributes.

40 Summary & Next Steps
Summary – Module 4: Scoring Keys & Rubrics: Developed a scoring key and rubrics for all items.
Next Steps – Module 5: Operational Forms & Administrative Guidelines: Given the items/tasks developed, create an operational form with applicable administrative guidelines.

41 Assessment Literacy Series
-Module 5- Operational Forms and Guides IMT Orientation Draft 02Sept11-CS

42 Objectives
Participants will:
- Construct operational forms (performance measures) from developed items by organizing and sequencing items/tasks, constructing the form, assigning item tags, and aligning to the Test Specifications & Blueprint.
- Develop teacher guidelines to administer the performance measure.

43 Helpful Tools
Participants may wish to reference the following:
Guides: Handout #8 – Operational Form Example; Handout #9 – Administrative Guidelines Example
Templates: Template #5 – Operational Form; Template #6 – Administrative Guidelines
Other “Stuff”: Performance Task Framework-Full

44 Outline of Module 5
Module 5: Operational Forms & Administrative Guidelines
- Operational Forms
- Process Steps
- Administrative Guidelines

45 Operational Forms [Handout #8]

46 Operational Forms
All developed items will now be inserted into the proper format: the Operational Form. Operational Forms contain the following components:
- Cover page
- Student information section
- Test-taker instructions
- Item directions
- Items/Tasks
- Answer options/response area
- Item tags
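A minimal sketch, not from the slides, of how the components listed above could be modeled as a data structure; every field name and value here is illustrative.

from dataclasses import dataclass, field

@dataclass
class Item:
    tag: str                                     # e.g. "0001.MTH.ALGI.POST.MC-LV1-8EE2"
    directions: str                              # what to do, where/how to respond, point value
    prompt: str
    options: list = field(default_factory=list)  # answer options; empty for CR items/tasks

@dataclass
class OperationalForm:
    cover_page: str
    student_info_fields: list
    test_taker_instructions: str
    items: list                                  # sequenced items/tasks, each carrying its tag

form = OperationalForm(
    cover_page="Algebra I End-of-Course Post-test (Spring)",
    student_info_fields=["name", "teacher", "date"],
    test_taker_instructions="Read each item carefully and mark one answer.",
    items=[Item(tag="0001.MTH.ALGI.POST.MC-LV1-8EE2",
                directions="Choose the best answer. (1 point)",
                prompt="Which expression is equivalent to ...?",
                options=["A", "B", "C", "D"])],
)
print(len(form.items))  # 1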

47 Operational Forms (cont.)
An operational form is the actual performance measure that will be given to the student. The form displays the sequence of items, passages, and prompts the test-taker is to engage with and respond to during the testing period.

48 Operational Forms (cont.)
- The Test Specifications & Blueprint should be the same for the pre-test and the post-test.
- For some complex measures, the operational form may be a single prompt or task.
- All items must have an item tag.
- Consider the necessity for charts, graphs, images, figures, etc.
- Avoid creating a “cueing” or sequence bias among items/tasks.

49 Process Steps [Template #5]
1. Complete the cover design and demographic information necessary to reflect the performance measure developed and the context in which it will be used.
2. Organize the items/tasks in a sequence that will maximize student engagement.
3. Add directions to the test-taker instructions detailing the requirements for different types of items/tasks, especially CR items/tasks.
4. Add item tags.
5. Screen and refine the draft form to minimize “blank space,” verify picture, graph, table, and figure placement in relation to the item/task, and ensure MC answer options do not drift from one page to the next.

50 QA Checklist
- Directions state what to do, where and how to respond, and the point value for items/tasks.
- Items/tasks on the operational form reflect the blueprint.
- Form layout minimizes “white space,” “crowding of items,” and the distance from stem to response options/placement.
- Forms are free from unclear, small, or unnecessary images, figures, charts, or graphs.

51 Administrative Guidelines [Handout #9]

52 Administrative Guidelines
Administrative Guidelines are composed of three phases:
1. Preparing for Test Administration
2. Conducting Test Administration
3. Post Administration

53 Administrative Guidelines (cont.)
Preparing for Test Administration:
- Establish the testing window.
- Specify what forms are needed.
- Review administration steps and student directions.
- Review the performance measures before administration.
- Establish a secure testing environment.
- Determine the process for make-up tests.

54 Administrative Guidelines (cont.)
Conducting Test Administration:
- Give directions for how to distribute forms.
- If possible, provide a script for the test administrator to use during administration.
- Provide directions for start/stop times, completing student information, and beginning and ending the testing session.

55 Administrative Guidelines (cont.)
Post Administration:
- Provide instructions for collecting and securing forms.
- Provide scoring instructions.
- Include how results are reported.
- Determine the overall score and/or performance status (e.g., met minimum status).

56 Administrative Guidelines (cont.)
- Administrative guidelines must provide sufficient detail so that testing conditions are comparable between classrooms and across different schools.
- Activities within the testing environment (before, during, and after) must be articulated in a logical sequence.
- Test security and handling of scored materials must be outlined.

57 Process Steps
1. Use the Administrative Guidelines Template to create administrative steps for before, during, and after the test.
2. Explain any requirements or equipment necessary, including accommodations.
3. State any ancillary materials (e.g., calculators) needed/allowed by the test-taker.
4. Identify the approximate time needed to take the test.
5. Include detailed “scripts” articulating exactly what is to be communicated to the test-takers, with particular emphasis on performance tasks.
6. Include procedures for scoring, make-up tests, and handling completed tests.

58 QA Checklist
- Guidelines explain administrative steps before, during, and after testing.
- Requirements for completing the performance measure items/tasks, including conditions, equipment, and materials, are included.
- The approximate time needed to complete the overall performance measure is provided.
- Detailed “scripts” articulate the information to be communicated in a standardized manner.

59 Summary & Next Steps
Summary – Module 5: Operational Forms & Administrative Guidelines: Created operational forms (reflecting the developed blueprint) in conjunction with test administration guidelines.
Next Steps – Module 6: Form Reviews: Given the operational form, scoring key, and administrative guidelines, evaluate the quality of the materials, with particular emphasis on items/tasks within the performance measure.

60 Joining the SLO Professional Learning Community on SAS
Go to the SAS home page (www.pdesas.org). Log in with your username and password. If you do not have an account with SAS, you will have to create one.

61 Enter your information on the login page and submit.

62 Once you have successfully logged in and are at the SAS home page, go to Teacher Tools in the upper right corner.

63 Click on Teacher Tools; this will display various tools. Locate the button labeled “My Communities.”

64 This will open your memberships in the various Professional Learning Communities. If you are not a member of the Student Learning Objectives PLC, type SLO in the search bar.

65 Once you are a member of the SLO community, you will have access to communication with all other members and a calendar of upcoming events.

66 Along with posting questions to the entire community, you have access to the Digital Repository, which contains SLO training materials and supporting documents. (This is located at the bottom of the SLO community page.)

67 Contact Info
PDE POC: Mr. O. David Deitz
RIA POC: Dr. JP Beaudoin

68 SAS Institute

