6 Assessment Literacy Series - Module 3: Item Specifications (IMT Orientation Draft 02Sept11-CS)
7 Objectives
Participants will:
- Examine multiple choice and constructed response items/tasks.
- Develop items/tasks that fit the previously created specifications and blueprint in terms of:
  - content accuracy;
  - item type;
  - cognitive load; and
  - sufficiency.
8 Helpful Tools
Participants may wish to reference the following:
Guides
- Handout #4 – DoK Chart
- Handout #5 – Item Examples
Other “Stuff”
- DoK Chart II
- “Smart Book”
- Textbooks, teacher manuals, and other supplemental materials
10 Multiple Choice (MC) Guidelines
- MC items consist of a stem and answer options.
- Grades K-1 = 3 options; Grades 2-12 = 4 options.
- MC items contain only one correct answer.
- The other options (distractors) have the same structure and length as the answer.
- Distractors should be plausible (realistic).
- Balance the placement of the correct answer.
- Avoid answer options that provide clues to the answer.
11 MC Guidelines (cont.)
- Answer options should be in ascending or descending order when possible.
- Avoid “All of the above” and “None of the above.”
- Directions state what to do, where and how to respond, and point values.
- Refrain from adding directions when test-takers are repeating the same behavior from item to item.
12 Constructed Response (CR) Guidelines
- Language is appropriate to the age and experience of the students.
- Student expectations are clear:
  - Explain vs. Discuss
  - Describe vs. Comment
- State the extent of the expected answer:
  - Give three reasons vs. give some reasons
- Directions state what to do, where and how to respond, and point values.
- Refrain from adding directions when test-takers repeat the same behavior from item/task to item/task.
13 CR Guidelines (cont.)
Short Constructed Response (SCR) items/tasks:
- One step to solve
- Require a brief response (2-5 minutes)
- Worth up to 3 points
Extended Constructed Response (ECR) items/tasks:
- Multiple steps to solve
- Require 5-10 minutes to answer
- Worth 4 or more points
Note: ALL CR items/tasks require a well-developed scoring rubric.
14 Helpful Hints
Scenarios and passages should be:
- Relatively short
- Developmentally appropriate
- Sensitive to readability
Performance expectations must state exactly what is to be observed and how it will be measured.
Items/tasks are considered secure even in draft form.
Copyrights must be handled appropriately.
Images, graphs, and charts must be clear and of sufficient size for interpretation.
15 Item Tag Codification
Each item/task requires a unique identifier. Tags contain information used to code and identify items/tasks.
Item tags typically contain:
- Item number
- Subject
- Grade
- Post test
- Item type
- DoK level
- Content standard number
Example: 0008.MTH.GR4.POST.MC-LV2-4OA1
(Item # = 0008, Subject = MTH, Grade = GR4, Administration = POST, Item Type = MC, DoK = LV2, Standard ID = 4OA1)
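The tag layout described above can be sketched as a small parsing helper. This is an illustrative example only, not part of the training materials; the field names are assumptions based on the labels shown on the slide.

```python
def parse_item_tag(tag):
    """Split an item tag such as 0008.MTH.GR4.POST.MC-LV2-4OA1
    into its labeled parts (field names are illustrative)."""
    item_number, subject, grade, administration, rest = tag.split(".")
    item_type, dok_level, standard = rest.split("-")
    return {
        "item_number": item_number,        # e.g. 0008
        "subject": subject,                # e.g. MTH
        "grade": grade,                    # e.g. GR4
        "administration": administration,  # e.g. POST
        "item_type": item_type,            # e.g. MC, SCR, ECR
        "dok_level": dok_level,            # e.g. LV2
        "standard": standard,              # e.g. 4OA1
    }

fields = parse_item_tag("0008.MTH.GR4.POST.MC-LV2-4OA1")
# fields["item_type"] == "MC", fields["standard"] == "4OA1"
```

A helper like this makes it easy to verify every tag on a form follows the same convention before the items are assembled into an operational form.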
16 Process Steps
1. Review specification tables, blueprint, and targeted standards.
2. Draft the item, passage, prompt, or scenario. Insert any necessary tables, graphs, or images.
3. For MC items, create the single correct answer and then identify distractors that reflect common errors/misinterpretations. For CR items/tasks, create a scoring rubric that articulates different levels of performance, including a sample response for each level.
4. Specify the item/task unique ID (e.g., item number.subject.grade.post.item type-DoK level-content standard number).
5. Repeat Steps 1-4 for the remainder of the items/tasks needed to complete the blueprint.
17 QA Checklist
- Types, point values, and DoKs match the blueprint.
- There are sufficient items/tasks to sample the targeted content.
- The items/tasks are developmentally appropriate for the intended test-takers.
- The correct answers and/or expected responses are clearly identified.
- Each item/task is assigned a unique ID.
18 Think-Pair-Share
Based on the discussed item specifications, decide what is wrong with the following questions.
1. Some scientists believe that Pluto is __________.
   - an escaped moon of Neptune
   - usually the most distant planet in the solar system
   - the name of Mickey Mouse’s dog
   - all of the above
2. The people of Iceland __________.
   - a country located just outside the Arctic Circle
   - work to keep their culture alive
   - claim to be descendants of the Aztecs
   - the capital, Reykjavik, where arms talks have been held
19 Summary & Next Steps
Summary
- Module 3: Item Specifications: Developed performance measure items/tasks that match the applicable blueprints.
Next Steps
- Module 4: Scoring Keys and Rubrics: Given the items/tasks created, develop scoring keys and rubrics.
20 Assessment Literacy Series - Module 4: Scoring Keys and Rubrics
21 Objectives
Participants will:
- Develop scoring keys for all multiple choice items outlined within the blueprint.
- Develop scoring rubrics for constructed response items/tasks that reflect a performance continuum.
22 Helpful Tools
Participants may wish to reference the following:
Guides
- Handout #6 – Scoring Key Example
- Handout #7 – Rubric Examples
Templates
- Template #5 – Scoring Key-Rubric
Other “Stuff”
- Performance Task Framework
23 Outline of Module 4
Module 4: Scoring Keys and Rubrics
- Scoring Key for MC
- Process Steps
- Rubrics for Constructed Response
- Sample Answers
- Scoring Criteria
25 Scoring Keys
Scoring keys typically contain elements such as the following:
- Performance measure name or unique identifier
- Grade/Course
- Administration timeframe (e.g., fall, mid-term, final examination, spring)
- Item tag and type
- Maximum points possible
- Correct answer (MC) or rubric with sample answers or anchor papers (SCR & ECR)
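The elements above map naturally onto a simple record structure. The sketch below is illustrative only; the field names are assumptions, not part of the training materials.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ScoringKeyEntry:
    """One row of a scoring key (illustrative field names)."""
    item_number: int
    item_tag: str            # unique identifier for the item/task
    item_type: str           # MC, SCR, or ECR
    point_value: int         # maximum points possible
    answer: Optional[str]    # MC answer letter; None when a rubric applies

# An MC row records the single correct answer; a CR row defers to its rubric.
mc_row = ScoringKeyEntry(1, "0001.MTH.ALGI.POST.MC-LV1-8EE2", "MC", 1, "C")
cr_row = ScoringKeyEntry(13, "0013.MTH.ALGI.POST.ECR-LV2-NRN2", "SCR", 3, None)
```

Keeping the key in a structured form like this makes the later QA steps (validating single correct answers, checking balance) straightforward to automate.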
26 Scoring Key Example [Handout #6]
Assessment Name: Algebra I End-of-Course
Grade/Course: Algebra I
Administration: Post-test (Spring)
UAI (Unique Assessment Identifier):

Item #  Item Tag                         Type  Points  Answer
1       0001.MTH.ALGI.POST.MC-LV1-8EE2   MC            C
2       0002.MTH.ALGI.POST.MC-LV1-ACED1  MC            B
3       0003.MTH.ALGI.POST.MC-LV1-8EE1   MC
4       0004.MTH.ALGI.POST.MC-LV1-ASSE1  MC            D
5       0005.MTH.ALGI.POST.MC-LV2-ASSE1  MC            A
6       0006.MTH.ALGI.POST.MC-LV2-7RP3   MC
7       0007.MTH.ALGI.POST.MC-LV2-ACED1  MC
8       0008.MTH.ALGI.POST.MC-LV2-ACED1  MC
9       0009.MTH.ALGI.POST.MC-LV1-ACED1  MC
10      0010.MTH.ALGI.POST.MC-LV1-AREI3  MC
11      0011.MTH.ALGI.POST.MC-LV1-FIF1   MC
12      0012.MTH.ALGI.POST.MC-LV2-ACED1  MC
13      0013.MTH.ALGI.POST.ECR-LV2-NRN2  SCR           See Scoring Rubric
14      0014.MTH.ALGI.POST.ECR-LV2-FIF1  ECR           See Scoring Rubric
27 Scoring Keys
Scoring keys for MC items:
- Answers within the key must represent a single, correct response.
- Answers should be validated once the key is developed to avoid human error.
- Validating answers should be done prior to form review.
- Items changed during the review stage must be revalidated to ensure the scoring key is correct.
28 Process Steps [Template #4]
1. Enter the assessment information at the top of the Scoring Key.
2. Record the single, correct answer during item development. For SCR and ECR items/tasks, reference the scoring rubrics in the answer column and place them in the correct rubric table on the Rubric Template.
3. Record the item number, item tag, item type, and point value.
4. Record the MC answers in the answer column. For each CR item, include the general scoring rubric and a sample response for each point value.
5. Repeat Steps 1-4 until all items/tasks on the blueprint are reflected within the Scoring Key.
29 QA Checklist
- All items/tasks articulated on the blueprint are represented within the Scoring Key.
- MC items have been validated to ensure only one correct answer exists among the options provided.
- MC answers do not create a discernible pattern.
- MC answers are “balanced” among the possible options.
- Scoring Key answers are revalidated after the final operational form reviews are complete.
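The pattern and balance checks above can be partially automated. The sketch below is illustrative only; the balance threshold and maximum run length are assumptions, not values prescribed by the training materials.

```python
from collections import Counter

def check_answer_key(key, options=("A", "B", "C", "D"), max_run=3):
    """Flag an unbalanced distribution of correct answers and long runs
    of the same answer in an MC key. Thresholds are illustrative."""
    warnings = []
    counts = Counter(key)
    expected = len(key) / len(options)
    for opt in options:
        # Crude balance check: flag options used far more or less than expected.
        if abs(counts[opt] - expected) > expected:
            warnings.append(f"Option {opt} used {counts[opt]} times (expected ~{expected:.1f})")
    # Detect the longest run of identical consecutive answers.
    run, longest = 1, 1
    for prev, cur in zip(key, key[1:]):
        run = run + 1 if cur == prev else 1
        longest = max(longest, run)
    if longest > max_run:
        warnings.append(f"Run of {longest} identical answers detected")
    return warnings

# A balanced key with no long runs produces no warnings:
# check_answer_key(["A", "B", "C", "D", "A", "B", "C", "D"]) == []
```

A reviewer would still inspect the key by eye; a check like this simply catches the most obvious patterns before form review.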
31 Holistic vs. Analytic Rubric Scoring
Holistic Scoring
- Provides a single score based on an overall determination of the student’s performance
- Assesses a student’s response as a whole for overall quality
- Most difficult approach for calibrating different raters
Analytic Scoring
- Identifies and assesses specific aspects of a response
- Multiple dimension scores are assigned
- Provides a logical combination of subscores for the overall assigned score
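The analytic combination of subscores described above can be sketched in a few lines. The dimension names and equal default weights below are assumptions for illustration, not part of the training materials.

```python
def analytic_score(subscores, weights=None):
    """Combine per-dimension subscores into one overall score.
    With no weights given, each dimension counts equally."""
    weights = weights or {dim: 1 for dim in subscores}
    return sum(subscores[dim] * weights[dim] for dim in subscores)

# Hypothetical dimensions for a writing task:
total = analytic_score({"content": 3, "organization": 2, "conventions": 1})
# total == 6
```

The design point is that the combination rule is explicit and reproducible, which is what makes analytic scoring easier to calibrate across raters than holistic scoring.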
32 Rubric Scoring Considerations
- Describe whether spelling and/or grammar will impact the final score.
- Avoid using words like “many,” “some,” and “few” without numeric descriptors to quantify these terms.
- Avoid using subjective words such as “creativity” or “effort.”
- Avoid subjective adjectives such as “excellent” or “inadequate.”
33 SCR Rubric Example [Handout #7]
General Scoring Rubric
2 points: The response gives evidence of a complete understanding of the problem. It is fully developed and clearly communicated. All parts of the problem are complete. There are no errors.
1 point: The response gives evidence of a reasonable approach but also indicates gaps in conceptual understanding. Parts of the problem may be missing. The explanation may be incomplete.
0 points: There is no response, or the work is completely incorrect or irrelevant.
34 SCR Rubric Example [Handout #7]
Sample Response: “In two complete sentences, explain why people should help save the rainforests.”
2 points: The student’s response is written in complete sentences and contains two valid reasons for saving the rainforest.
  “People must save the rainforest to save the animals’ homes. People need to save the rainforest because we get ingredients for many medicines from there.”
1 point: The student’s response contains only one reason.
  “People should save the rainforest because it is important and because people and animals need it.”
35 Rubrics for ECR Tasks
- Create content-based descriptions of the expected answer for each level of performance on the rubric.
- Provide an example of a fully complete/correct response along with examples of partially correct responses.
- Reference the item expectations in the rubric.
- Make the rubric as clear and concise as possible so that other scorers would assign exact/adjacent scores to the performance/work under observation.
36 ECR Rubric Example [Handout #7]
General Scoring Rubric
4 points: The response provides all aspects of a complete interpretation and/or a correct solution. The response thoroughly addresses the points relevant to the concept or task. It provides strong evidence that information, reasoning, and conclusions have a definite logical relationship. It is clearly focused and organized, showing relevance to the concept, task, or solution process.
3 points: The response provides the essential elements of an interpretation and/or a solution. It addresses the points relevant to the concept or task. It provides ample evidence that information, reasoning, and conclusions have a logical relationship. It is focused and organized, showing relevance to the concept, task, or solution process.
2 points: The response provides a partial interpretation and/or solution. It somewhat addresses the points relevant to the concept or task. It provides some evidence that information, reasoning, and conclusions have a relationship. It is relevant to the concept and/or task, but there are gaps in focus and organization.
1 point: The response provides an unclear, inaccurate interpretation and/or solution. It fails to address or omits significant aspects of the concept or task. It provides unrelated or unclear evidence that information, reasoning, and conclusions have a relationship. There is little evidence of focus or organization relevant to the concept, task, and/or solution process.
0 points: The response does not meet the criteria required to earn one point. The student may have written on a different topic or written "I don't know."
37 ECR Rubric Example [Handout #7]
Sample Response: “List the steps of the Scientific Method. Briefly explain each one.”
4 points:
- Ask a Question: Ask a question about something that you observe: How, What, When, Who, Which, Why, or Where?
- Do Background Research: Use library and Internet research to help you find the best way to do things.
- Construct a Hypothesis: Make an educated guess about how things work.
- Test Your Hypothesis: Do an experiment.
- Analyze Your Data and Draw a Conclusion: Collect your measurements and analyze them to see if your hypothesis is true or false.
- Communicate Your Results: Publish a final report in a scientific journal or by presenting the results on a poster.
3 points:
- Ask a Question
- Do Background Research: Use library and Internet research.
- Construct a Hypothesis: An educated guess about how things work.
- Analyze Your Data and Draw a Conclusion
- Communicate Your Results
2 points:
- Do Background Research
- Construct a Hypothesis
- Test Your Hypothesis
1 point:
- Ask a Question, Hypothesis, Do an Experiment, Analyze Your Data
0 points:
- “I don’t know.”
38 Process Steps [Template #4]
1. Create the item/task description for the student.
2. Using a “generic” rubric, begin by modifying the language with the specific criteria expected in the response to award the maximum number of points.
3. Next, determine how much the response can deviate from “fully correct” to earn the next (lower) point value. Continue until the full range of possible scores is described.
4. Using the “sample” rubric, create an example of a correct or possible answer for each level in the rubric.
5. In review, ensure the item/task description for the student, the scoring rubric, and the sample rubric are aligned.
39 QA Checklist
- CR items/tasks have scoring rubrics that reflect a performance continuum.
- CR items/tasks include sample responses for each level of performance.
- CR scoring rubrics are clear and concise.
- CR scoring rubrics include all dimensions (aspects) of the tasks presented to the students.
- CR scoring rubrics avoid non-cognitive attributes (motivation, timeliness, etc.) and content-irrelevant attributes.
40 Summary & Next Steps
Summary
- Module 4: Scoring Keys & Rubrics: Developed a scoring key and rubrics for all items.
Next Steps
- Module 5: Operational Forms & Administrative Guidelines: Given the items/tasks developed, create an operational form with applicable administrative guidelines.
41 Assessment Literacy Series - Module 5: Operational Forms and Guides
42 Objectives
Participants will:
- Construct operational forms (performance measures) from developed items by:
  - Organizing and sequencing items/tasks
  - Constructing the form
  - Assigning item tags
  - Aligning to the Test Specifications & Blueprint
- Develop teacher guidelines to administer the performance measure.
43 Helpful Tools
Participants may wish to reference the following:
Guides
- Handout #8 – Operational Form Example
- Handout #9 – Administrative Guidelines Example
Templates
- Template #5 – Operational Form
- Template #6 – Administrative Guidelines
Other “Stuff”
- Performance Task Framework (Full)
44 Outline of Module 5
Module 5: Operational Forms & Administrative Guidelines
- Operational Forms
- Process Steps
- Administrative Guidelines
46 Operational Forms
All developed items will now be inserted into the proper format: the Operational Form.
Operational Forms contain the following components:
- Cover page
- Student information section
- Test-taker instructions
- Item directions
- Items/tasks
- Answer options/response area
- Item tags
47 Operational Forms (cont.)
- An operational form is the actual performance measure that will be given to the student.
- The form displays the sequence of items/passages/prompts the test-taker is to engage and respond to during the testing period.
48 Operational Forms (cont.)
- Test Specifications & Blueprint should be the same for the pre-test and the post-test.
- For some complex measures, the “operational form” may be a single prompt or task.
- All items must have an item tag.
- Consider the necessity for charts, graphs, images, figures, etc.
- Avoid creating a “cueing” or sequence bias among items/tasks.
49 Process Steps [Template #5]
1. Complete the cover design and demographic information necessary to reflect the performance measure developed and the context in which it will be used.
2. Organize the items/tasks in a sequence that will maximize student engagement.
3. Add directions to the test-taker instructions detailing the requirements for different types of items/tasks, especially CR items/tasks.
4. Add item tags.
5. Screen and refine the draft form to minimize “blank space,” verify picture, graph, table, and figure placement in relation to the item/task, and ensure MC answer options do not drift from one page to the next.
50 QA Checklist
- Directions state what to do, where and how to respond, and the point value for items/tasks.
- Items/tasks on the operational form reflect the blueprint.
- Form layout minimizes “white space,” “crowding of items,” and distance from stem to response options.
- Forms are free from unclear, small, or unnecessary images, figures, charts, or graphs.
52 Administrative Guidelines
Administrative Guidelines are composed of three phases:
1. Preparing for Test Administration
2. Conducting Test Administration
3. Post Administration
53 Administrative Guidelines (cont.)
Preparing for Test Administration
- Establish the testing window.
- Specify what forms are needed.
- Review administration steps and student directions.
- Review the performance measures before administration.
- Establish a secure testing environment.
- Determine the process for make-up tests.
54 Administrative Guidelines (cont.)
Conducting Test Administration
- Give directions for how to distribute forms.
- If possible, provide a script for the test administrator to use during administration.
- Provide directions for start/stop times, completing student information, and beginning and ending the testing session.
55 Administrative Guidelines (cont.)
Post Administration
- Provide instructions for collecting and securing forms.
- Provide scoring instructions.
- Include how results are reported.
- Determine overall score and/or performance status (e.g., met minimum status).
56 Administrative Guidelines (cont.)
- Administrative guidelines must provide sufficient detail so that testing conditions are comparable between classrooms and across different schools.
- Activities within the testing environment (before, during, and after) must be articulated in a logical sequence.
- Test security and handling of scored materials must be outlined.
57 Process Steps
1. Use the Administrative Guidelines Template to create administrative steps for before, during, and after the test.
2. Explain any requirements or equipment necessary, including accommodations. State any ancillary materials (e.g., calculators) needed/allowed by the test-taker.
3. Identify the approximate time needed to take the test.
4. Include detailed “scripts” articulating exactly what is to be communicated to the test-takers, with particular emphasis on performance tasks.
5. Include procedures for scoring, make-up tests, and handling completed tests.
58 QA Checklist
- Guidelines explain administrative steps before, during, and after testing.
- Requirements for completing the performance measure items/tasks, including conditions, equipment, and materials, are included.
- The approximate time needed to complete the overall performance measure is provided.
- Detailed “scripts” articulate the information to be communicated in a standardized manner.
59 Summary & Next Steps
Summary
- Module 5: Operational Forms & Administrative Guidelines: Created operational forms (reflecting the developed blueprint) in conjunction with test administration guidelines.
Next Steps
- Module 6: Form Reviews: Given the operational form, scoring key, and administrative guidelines, evaluate the quality of the materials, with particular emphasis on items/tasks within the performance measure.
60 Joining the SLO Professional Learning Community on SAS
1. Go to the SAS home page (www.pdesas.org).
2. Log in with your user name and password. If you do not have an account with SAS, you will have to create one.
61 Enter your information on the login page and submit.
62 Once you have successfully logged in and are at the SAS home page, go to Teacher Tools in the upper right corner.
63 Click on Teacher Tools; this will provide you with various tools. Locate the button labeled “My Communities.”
64 This will open your membership to various Professional Learning Communities. If you are not a member of the Student Learning Objectives PLC, type SLO in the search bar.
65 Once a member of the SLO community, you will have access to communication with all other members and a calendar of upcoming events.
66 Along with posting questions to the entire community, you have access to the Digital Repository, in which SLO training materials and supporting documents are located. (This is located at the bottom of the SLO community page.)
67 Contact Info
PDE POC: Mr. O David Deitz
RIA POC: Dr. JP Beaudoin