
1 Tuning Up your Common Assessments Michigan School Testing Conference February 21, 2012 Dr. Ed Roeber Kim Young Dr. Ellen Vorenkamp

2 Who Are We? Let's speculate about the people in the room. What one question might you ask to explore your notion?

3 Who Are We? For the next 5 minutes, circulate around the room: share your name, professional role, and district; ask your question without comment or clarification and record the data; then analyze the data. What assumptions might you make about the people in the room? To what extent did your question give you the data you were looking for?

4 Outcomes Participants will recognize the need for quality classroom assessments including elements such as: ◦Standard/Item Alignment ◦Balance of Representation ◦Target/Method Match ◦Quality Items ◦Test Blueprints Participants will reflect on and modify (where needed) current assessments

5 Setting the stage… Table activity Protocol – Chalk Talk. In the center of the chart paper, write "Quality Assessments." Without comment… what are your hunches about the need to build high-quality assessments?

6 Key Questions Think…Pair…Share ◦What elements are necessary to assure quality common assessments? ▪List these qualities ▪Discuss why these are important

7 Rubric Review Validity Checklist ◦Standard Alignment ◦Balance of Representation ◦Target/Method Match ◦Quality Items ◦Test Blueprints

8 Deconstructing Assessments Activity ◦Break it apart…see what you have…

9 Deconstructing Debrief Are the assessment items tightly aligned with the standards? Is there an equal number of items per standard? If not, is there a rationale? Are there enough items per standard to determine mastery?
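To make the items-per-standard check concrete, here is a minimal tally sketch; the standard codes, item list, and minimum-item threshold are hypothetical, not taken from the workshop materials.

```python
from collections import Counter

# Hypothetical mapping from each assessment item to the standard it targets.
item_to_standard = {
    "item_01": "NBT.1", "item_02": "NBT.1", "item_03": "NBT.2",
    "item_04": "NBT.2", "item_05": "NBT.2", "item_06": "OA.3",
}

MIN_ITEMS_FOR_MASTERY = 3  # hypothetical cut-off a team might agree on

# Count how many items address each standard and flag thin coverage.
counts = Counter(item_to_standard.values())
for standard, n in sorted(counts.items()):
    note = "" if n >= MIN_ITEMS_FOR_MASTERY else "  <- too few items to judge mastery"
    print(f"{standard}: {n} item(s){note}")
```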

10 Break Please return in 15 minutes

13 Kinds of Learning Targets Knowledge – facts and concepts we want students to know Reasoning – using what they know to reason and solve problems Skills – students use their knowledge and reasoning to act skillfully Products – use knowledge, reasoning, and skills to create a concrete product

14 Method of Assessment Selected Response/Short Response ◦True/false, multiple-choice, matching, fill-in-the-blank, short answers Extended Response ◦Essays, research reports and lab reports Performance ◦Public performances, investigations Personal Communication through conversation/observation ◦Oral exams, interviews, discussion groups

15 Target-Method Match How well does your method of assessment match your target?
Targets to be assessed (rows): Knowledge, Reasoning, Performance Skills, Products
Assessment methods (columns): Selected Response/Short Response, Extended Response, Performance Assessment, Personal Communication

16 Target-Method Match With an "elbow" partner… fill in the grid of the TMM chart. Which rating may be best for each cell? ◦Good match ◦Partial match ◦Not a good match

17 Target-Method Match How well does your method of assessment match your target? Ratings from the completed chart:
Knowledge: Good match / Not a good match / Partial match
Reasoning: Good match
Performance Skills: Not a good match / Good match / Partial match
Products: Not a good match / Partial match / Good match / Not a good match

18 Target-Method Match

19 In looking at items on your assessment, might there be an assessment method that could better capture evidence of student understanding of a standard? What will you stay mindful of as you rethink or develop assessment items to assess standards?

20 Quality Items

21 General Item Writing Guidelines Remember – the development of good items takes time and careful thought

22 Parts of a Multiple-Choice Item
Stem: What is the perimeter of a rectangular vegetable garden with dimensions 6 feet by 8 feet?
Options: A 48 ft, B* 28 ft, C 24 ft, D 14 ft
Correct answer (key): B
Distractors (incorrect options or foils): A, C, D
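As a quick check on the keyed option (the formula itself is standard geometry, not text from the slide):

```latex
P = 2(\ell + w) = 2(6\,\text{ft} + 8\,\text{ft}) = 28\,\text{ft}
```

Note how the distractors mirror likely errors: 48 ft is the area (6 × 8) and 14 ft is one length plus one width, which connects to the later guideline on using plausible distractors.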

23 General Guidelines 1. Align items to a standard 2. Target the appropriate Depth of Knowledge 3. Use clear, concise language 4. Use correct grammar 5. Use an appropriate reading level 6. Avoid the use of the words "you" and "I" 7. Avoid using synonyms within the item

24 General Guidelines 8. Avoid unnecessary complexity 9. Don’t assume prior knowledge 10. Remember: Formatting matters: font sizes, distractor placement, etc.

25 Guidelines About Writing Stems

26 Two Types of Multiple Choice Stems ◦Open-ended statement, followed by (usually) 3 or 4 answer choices ◦Closed question, followed by (usually) 3 or 4 answer choices

27 Examples
Open-ended stem: One of the factors of x² – 5x – 36 is ___ A x + 3 B x - 4 C x + 6 D* x - 9
Closed question stem: Which of the following is a factor of x² – 5x – 36? A x + 3 B x - 4 C x + 6 D* x - 9
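For reference, the keyed option can be verified by factoring (standard algebra added here as a check, not text from the slide):

```latex
x^2 - 5x - 36 = (x - 9)(x + 4)
```

Since x + 4 is not among the choices, x - 9 (option D) is the only listed factor, so both versions of the stem have exactly one defensible key.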

28 General Guidelines: Multiple Choice Items 11. Stuff the stem 12. Avoid redundancy 13. Avoid the use of negatives 14. Avoid clues in the stem 15. Ensure lead materials are essential to the item

29 Stems With a Graphic/Stimulus
Lead: The stem and leaf plot gives the ages of the people who answered survey questions after buying a pair of roller blades on an Internet auction.
[Stem-and-leaf plot of ages; stems 1–7; key: 3 | 2 means 32]
Question: What is the median age of the people who answered the survey questions?
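The individual leaves are not reliably readable here, so only the general procedure for the keyed skill is shown (a textbook definition, not the slide's worked answer): read the ages off the plot in increasing order and take the middle value,

```latex
\tilde{x} =
\begin{cases}
x_{\left(\frac{n+1}{2}\right)} & \text{if } n \text{ is odd,}\\[4pt]
\tfrac{1}{2}\left(x_{\left(\frac{n}{2}\right)} + x_{\left(\frac{n}{2}+1\right)}\right) & \text{if } n \text{ is even,}
\end{cases}
```

where x_(k) denotes the k-th smallest age and n is the number of survey respondents.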

30 Guidelines for Writing Response Options

31 Parts of a Multiple Choice Item
Stem: What is the perimeter of a rectangular vegetable garden with dimensions 6 feet by 8 feet?
Options: A 48 ft, B* 28 ft, C 24 ft, D 14 ft
Correct answer (key): B
Distractors (incorrect options): A, C, D

32 General Guidelines for Writing Response Options 16. Use direct, clear terminology 17. Use plausible distractors/foils 18. Use equal length and detail 19. Make all distractors equally attractive 20. Organize the options

33 General Guidelines 21. Have only one correct answer 22. Do not use overlapping answers 23. Vary placement of option choices 24. Good items are fair items 25. Avoid using "All of the Above" and "None of the Above"

34 Constructed Response Items

35 A constructed response item is an assessment item that asks students to apply knowledge, skills, and/or critical-thinking abilities to real-world, standards-driven performance tasks. It requires a brief written response from students. These items often have several parts; students have to write, draw, and/or explain their answers.

36 Constructed Response Items Sometimes called “open-response” items, constructed response items are so named because they ask students to use their own thinking and background knowledge to develop answers without the benefit of any suggestions or choices. Constructed response items often have more than one way to correctly answer the question.

37 Constructed Response Items Constructed response items are good to use when you want students to: ◦Show their work ◦Explain a process ◦Complete a chart ◦Perform a geometric construction ◦Construct a graph ◦Identify patterns ◦Write an essay

38 Constructed Response Items Tie constructed response items to higher-level objectives. This type of item is good to use when you want to test a skill that can't be easily measured with a selected-response item (higher-order thinking skills, HOTS).

39 Constructed Response Items Two primary types of constructed response items: ◦Brief Constructed Response ◦Extended Constructed Response

40 Brief Constructed Response Items Require about 1-3 minutes of student response time Usually represented by one of the following 5 formats: ◦Fill in the blank ◦Short Answer ◦Label a diagram ◦Visual representation ◦Show your work

41 Extended Response Items Extended response items require students to provide evidence of understanding regarding a situation that demands more than a selected response or brief constructed response. They usually involve 20-30 minutes of student response time

42 Extended Response Items May require students to reflect and respond in a variety of contexts, such as: ◦Write an essay from a prompt ◦Take a position on a specific topic and support their stance ◦Solve a problem ◦Respond to findings of an investigation and/or experiment ◦Respond to written text

43 Extended Response Items Guidelines ◦Carefully word directions and prompts ◦Allow sufficient time for completion ◦Have resources necessary for item completion on hand and ready for use ◦Share with students elements/characteristics of a successful response, where appropriate

44 Constructed Response Items When designing common assessments, use a variety of brief constructed response items (these could include short answers, fill-in-the-blank, show-your-work, and visual representations) as well as extended constructed response items. Be sure they are aligned to appropriate (usually higher-level) learning targets.

45 Constructed Response Items The item should be clear and specific about what students should do. A constructed response item may have several questions. Allow for more than one way for students to respond.

46 Constructed Response Items Include necessary visual representations such as charts, graphs, pictures, short readings, and cartoons. Determine points possible for each item.

47 Constructed Response Items ◦Usually constructed response items are worth 2 or more points depending on the difficulty of the item and the task being performed. ◦Design a scoring protocol, based on the number of points possible, for each constructed-response item. ◦Scoring protocols are typically specific to each individual item.
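As one illustration of what such a scoring protocol might look like, here is a minimal sketch for a hypothetical 2-point constructed-response item; the score descriptors are invented for illustration and are not taken from the presenters' materials.

```python
# Hypothetical 2-point scoring protocol for a single constructed-response item.
scoring_protocol = {
    2: "Correct answer with complete supporting work or explanation",
    1: "Correct answer with incomplete work, or sound reasoning with a minor computational error",
    0: "Incorrect answer with no evidence of appropriate reasoning",
}

def descriptor(points: int) -> str:
    """Return the rubric descriptor for an awarded score; reject off-scale scores."""
    if points not in scoring_protocol:
        raise ValueError(f"Score must be one of {sorted(scoring_protocol)}")
    return scoring_protocol[points]

print(descriptor(1))
```

Keeping the protocol item-specific, as the slide recommends, simply means writing a separate descriptor set like this for each constructed-response item.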

48 Quality Item Hunt Dot Activity ◦Green = Item is good to go ◦Yellow = Item may need to be modified ◦Red = Item is not well-written and needs to be scrapped

49 Assessment Blueprints

50 Reflective Questions Did you develop your assessment blueprint prior to developing your common assessment? ◦Why is this desirable? Have you reviewed or modified your test blueprint during the development process? Does your assessment (or will it) reflect your intended blueprint?
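To make "reflects the intended blueprint" concrete, here is a minimal sketch that compares planned item counts against the assessment as built; the standards, methods, and counts are all hypothetical.

```python
# Hypothetical blueprint: planned item counts per (standard, assessment method).
blueprint = {
    ("NBT.1", "selected response"): 4,
    ("NBT.1", "constructed response"): 1,
    ("NBT.2", "selected response"): 3,
    ("NBT.2", "extended response"): 1,
}

# Hypothetical tally of the assessment as actually assembled.
built = {
    ("NBT.1", "selected response"): 4,
    ("NBT.2", "selected response"): 5,
    ("NBT.2", "extended response"): 1,
}

# Report every cell where the built test drifts from the plan.
for (standard, method), planned in sorted(blueprint.items()):
    actual = built.get((standard, method), 0)
    if actual != planned:
        print(f"{standard} / {method}: planned {planned}, built {actual}")
```

A paper table of standards by item type with target counts works just as well; the point is to check the assembled test against the plan before it is administered.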

51 Reassemble Assessments Activity ◦Put it back together…make changes as needed…

52 Now What? Next Steps

53 Wrap Up and Evaluation Ticket out the door…

54 Contact Information Dr. Ed Roeber, Michigan State U roeber@msu.edu 517.432.0427 Dr. Ellen Vorenkamp, Wayne RESA vorenke@resa.net 734.334.1318 Kimberly Young, MDE/BAA youngk1@michigan.gov 517.373.0988

