Tuning Up Your Common Assessments. Michigan School Testing Conference, February 21, 2012. Dr. Ed Roeber, Kim Young, Dr. Ellen Vorenkamp.


Who Are We? Let's speculate about the people in the room. What one question might you ask to explore your notion?

Who Are We? For the next 5 minutes, circulate around the room. Share your name, professional role, and district. Ask your question without comment or clarification, and record the data. Then analyze the data: What assumptions might you make about the people in the room? To what extent did your question give you the data you were looking for?

Outcomes Participants will recognize the need for quality classroom assessments including elements such as: ◦Standard/Item Alignment ◦Balance of Representation ◦Target/Method Match ◦Quality Items ◦Test Blueprints Participants will reflect on and modify (where needed) current assessments

Setting the stage… Table activity. Protocol: Chalk Talk. In the center of the chart paper, write "Quality Assessments." Without comment, respond to this question: What are your hunches about the need to build high-quality assessments?

Key Questions Think…Pair…Share ◦What elements are necessary to assure quality common assessments?  List these qualities  Discuss why these are important

Rubric Review Validity Checklist ◦Standard Alignment ◦Balance of Representation ◦Target/Method Match ◦Quality Items ◦Test Blueprints

Deconstructing Assessments Activity ◦Break it apart…see what you have…

Deconstructing Debrief Are the assessment items tightly aligned with the standards? Are there an equal number of items per standard? If not, is there a rationale? Are there enough items per standard to determine mastery?
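One quick way to check the balance questions above is to tally items per standard from the test blueprint. A minimal sketch in Python; the item-to-standard mapping and the minimum-items threshold below are hypothetical, not from the presentation:

```python
from collections import Counter

# Hypothetical blueprint: each item number mapped to the standard it targets.
item_standards = {
    1: "N.1", 2: "N.1", 3: "N.2", 4: "N.2",
    5: "N.2", 6: "A.1", 7: "A.1", 8: "A.2",
}

# Tally items per standard to spot over- or under-represented standards.
counts = Counter(item_standards.values())
for standard, n in sorted(counts.items()):
    print(f"{standard}: {n} item(s)")

# Flag standards with too few items to judge mastery.
# The threshold is a local judgment call, not a fixed rule.
MIN_ITEMS = 3
thin = sorted(s for s, n in counts.items() if n < MIN_ITEMS)
print("Too few items to determine mastery:", thin)
```

Standards that appear in the flagged list are candidates for added items, or for a documented rationale in the blueprint.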

Break Please return in 15 minutes

Kinds of Learning Targets Knowledge – facts and concepts we want students to know Reasoning – using what they know to reason and solve problems Skills – students use their knowledge and reasoning to act skillfully Products – use knowledge, reasoning, and skills to create a concrete product

Method of Assessment Selected Response/Short Response ◦True/false, multiple-choice, matching, fill-in-the-blank, short answer Extended Response ◦Essays, research reports, and lab reports Performance ◦Public performances, investigations Personal Communication through conversation/observation ◦Oral exams, interviews, discussion groups

Target-Method Match How well does your method of assessment match your target? Targets to be assessed: Knowledge, Reasoning, Performance Skills, Products. Assessment methods: Selected Response/Short-Response, Extended Response, Performance Assessment, Personal Communication.

Target-Method Match With an "elbow" partner, fill in the grid of the TMM chart. Which match may be best? ◦Good match ◦Partial match ◦Not a good match

Target-Method Match How well does your method of assessment match your target? (Methods, in order: Selected Response/Short-Response, Extended Response, Performance Assessment, Personal Communication.) Knowledge: Good match; Not a good match; Partial match. Reasoning: Good match. Performance Skills: Not a good match; Good match; Partial match. Products: Not a good match; Partial match; Good match; Not a good match.

Target Method Match

In looking at items on your assessment, might there be an assessment method that could better capture evidence of student understanding of a standard? What will you stay mindful of as you rethink or develop assessment items to assess standards?

Quality Items

General Item Writing Guidelines Remember – the development of good items takes time and careful thought

Parts of a Multiple-Choice Item
Stem: What is the perimeter of a rectangular vegetable garden with dimensions 6 feet by 8 feet?
A 48 ft
B* 28 ft
C 24 ft
D 14 ft
Distractors (incorrect options, or foils): A, C, D
Correct answer (key): B
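The key can be verified directly from the perimeter formula. The slide does not explain the distractors, but they plausibly reflect common student errors:

```latex
P = 2(l + w) = 2(6\,\text{ft} + 8\,\text{ft}) = 28\,\text{ft} \quad \text{(key, option B)}
% Plausible distractor sources (assumption, not stated on the slide):
% A: 48 = 6 \times 8  (computed the area instead of the perimeter)
% D: 14 = 6 + 8       (added one length and one width only)
```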

General Guidelines 1. Align items to a standard 2. Target the appropriate Depth of Knowledge 3. Use clear, concise language 4. Use correct grammar 5. Use an appropriate reading level 6. Avoid the use of the words "you" and "I" 7. Avoid using synonyms within the item

General Guidelines 8. Avoid unnecessary complexity 9. Don't assume prior knowledge 10. Remember that formatting matters: font sizes, distractor placement, etc.

Guidelines About Writing Stems

Two Types of Multiple Choice Stems  Open-ended statement, followed by (usually) 3 or 4 answer choices  Closed question, followed by (usually) 3 or 4 answer choices

Examples (the same answer choices serve both stems)
Open-ended stem: One of the factors of x² – 5x – 36 is ___
Closed question stem: Which of the following is a factor of x² – 5x – 36?
A x + 3
B x – 4
C x + 6
D* x – 9
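The key for both versions checks out by multiplying the factors back together; the distractors catch students who factor with the wrong signs:

```latex
x^2 - 5x - 36 = (x - 9)(x + 4)
% So x - 9 (option D) is a factor. Note the other factor is
% x + 4, not the distractor x - 4; likewise x + 3 and x + 6
% are near-misses on sign or constant.
```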

General Guidelines for Multiple-Choice Items 11. Stuff the stem (put as much of the item's content as possible into the stem) 12. Avoid redundancy 13. Avoid the use of negatives 14. Avoid clues in the stem 15. Ensure lead materials are essential to the item

Stems With a Graphic/Stimulus
Lead: The stem and leaf plot gives the ages of the people who answered survey questions after buying a pair of roller blades on an Internet auction.
[Stem-and-leaf plot shown on slide; key: 3 | 2 means 32]
Question: What is the median age of the people who answered the survey questions?

Guidelines for Writing Response Options

Parts of a Multiple Choice Item
Stem: What is the perimeter of a rectangular vegetable garden with dimensions 6 feet by 8 feet?
A 48 ft
B* 28 ft
C 24 ft
D 14 ft
Distractors (incorrect options): A, C, D
Correct answer (key): B

General Guidelines for Writing Response Options 16. Use direct, clear terminology 17. Use plausible distractors/foils 18. Use equal length and detail 19. Make all distractors equally attractive 20. Organize the options

General Guidelines 21. Have only one correct answer 22. Do not use overlapping answers 23. Vary the placement of option choices 24. Good items are fair items 25. Avoid using "All of the Above" and "None of the Above"

Constructed Response Items

 A constructed response item is an assessment item that asks students to apply knowledge, skills, and/or critical thinking abilities to real-world, standards-driven performance tasks.  It requires a brief written response from students and often has several parts. Students may have to write, draw, and/or explain their answers.

Constructed Response Items Sometimes called “open-response” items, constructed response items are so named because they ask students to use their own thinking and background knowledge to develop answers without the benefit of any suggestions or choices. Constructed response items often have more than one way to correctly answer the question.

Constructed Response Items Constructed response items are good to use when you want students to: ◦Show their work ◦Explain a process ◦Complete a chart ◦Perform a geometric construction ◦Construct a graph ◦Identify patterns ◦Write an essay

Constructed Response Items Tie constructed response items to higher-level objectives (higher-order thinking skills, or HOTS). This type of item is good to use when you want to test a skill that can't be easily measured with a selected-response item.

Constructed Response Items Two primary types of constructed response items: ◦Brief Constructed Response ◦Extended Constructed Response

Brief Constructed Response Items Require about 1-3 minutes of student response time Usually represented by one of the following 5 formats: ◦Fill in the blank ◦Short Answer ◦Label a diagram ◦Visual representation ◦Show your work

Extended Response Items Extended response items require students to provide evidence of understanding regarding a situation that demands more than a selected response or brief constructed response. They usually require considerably more student response time than brief constructed response items.

Extended Response Items  May require students to reflect and respond in a variety of contexts, such as: Write an essay from a prompt Take a position on a specific topic and support their stance Solve a problem Respond to findings of an investigation and/ or experiment Respond to written text

Extended Response Items Guidelines ◦Carefully word directions and prompts ◦Allow sufficient time for completion ◦Have resources necessary for item completion on hand and ready for use ◦Share with students elements/characteristics of a successful response, where appropriate

Constructed Response Items When designing common assessments, use a variety of brief constructed response items (these could include short answers, fill-in-the-blank, show-your-work, and visual representations) as well as extended constructed response items. Be sure they are aligned to appropriate (usually higher-level) learning targets.

Constructed Response Items The item should be clear and specific about what students should do. A constructed response item may have several questions. Allow for more than one way for students to respond.

Constructed Response Items Include necessary visual representations such as charts, graphs, pictures, short readings, and cartoons. Determine the points possible for each item.

Constructed Response Items  Usually constructed response items are worth 2 or more points depending on the difficulty of the item and the task being performed.  Design a scoring protocol, based on the number of points possible, for each constructed-response item.  Scoring protocols are typically specific to each individual item

Quality Item Hunt Dot Activity ◦Green= Item is good to go ◦Yellow = Item may need to be modified ◦Red = Item is not well-written and needs to be scrapped

Assessment Blueprints

Reflective Questions Did you develop your assessment blueprint prior to developing your common assessment? ◦Why is this desirable? Have you reviewed or modified your test blueprint during the development process? Does your or will your assessment reflect your intended blueprint?

Reassemble Assessments Activity ◦Put it back together…make changes as needed…

Now What? Next Steps

Wrap Up; Evaluation Ticket out the door…

Contact Information Dr. Ed Roeber, Michigan State U Dr. Ellen Vorenkamp, Wayne RESA Kimberly Young, MDE/BAA