
1 Building Common Assessments APRIL 2, 2013


3 Session Targets
This session will address:
◦ The key factors to consider when developing common assessments
◦ Strategies and tools for building common assessments
◦ Understanding of the rationale for and process of using common assessments


6 Non-Teaching Professional Employee Effectiveness System in Act 82 of 2012 (Effective 2014-2015 SY)
Observation/Evidence: Danielson Framework Domains
1. Planning and Preparation
2. Educational Environment
3. Delivery of Service
4. Professional Development
Student Performance of All Students in the School Building in which the Non-Teaching Professional Employee is Employed
District Designed Measures and Examinations; Nationally Recognized Standardized Tests; Industry Certification Examinations; Student Projects Pursuant to Local Requirements; Student Portfolios Pursuant to Local Requirements

7 Principal Effectiveness System in Act 82 of 2012 (Effective 2014-2015 SY)
Observation/Evidence: Domains
1. Strategic/Cultural Leadership
2. Systems Leadership
3. Leadership for Learning
4. Professional and Community Leadership
Building Level Data: Indicators of Academic Achievement; Indicators of Closing the Achievement Gap, All Students; Indicators of Closing the Achievement Gap, Subgroups; Academic Growth PVAAS; Other Academic Indicators; Credit for Advanced Achievement
Correlation Data Based on Teacher Level Measures: PVAAS
Elective Data/SLOs: District Designed Measures and Examinations; Nationally Recognized Standardized Tests; Industry Certification Examinations; Student Projects Pursuant to Local Requirements; Student Portfolios Pursuant to Local Requirements

8 Common Assessments in the Context of PLCs (DuFour, DuFour, Eaker & Many, 2006)
◦ What is it we expect students to learn?
◦ How will we know when students have learned it?
◦ How will we respond when students don't?
◦ How will we respond when students do?

9 What Do We Mean by "Common Assessments"?
Any assessment given by two or more instructors with the intention of collaboratively examining the results for:
◦ Shared learning
◦ Instructional planning for individual students, and/or
◦ Curriculum, instruction, and/or assessment modifications

10 What Constitutes Effective Classroom Assessment?
Key Points:
◦ Assessment is information, not scores. (Scores are accountability.)
◦ Assessment is best done early and often. (Assessment at the end is accountability.)

11 What Constitutes Effective Classroom Assessment?
Assessment that:
◦ Provides evidence of student performance relative to content and performance standards
◦ Provides teachers and students with insight into student errors and misunderstandings
◦ Helps lead the teacher and/or team directly to action

12 What is the Difference between Multiple Choice and Constructed Response Items?

13 The Stem: Why are we writing items?
A. to populate a state item bank
B. to build common assessments
C. to work with other teachers
D. to gain experience writing
(One option is the CORRECT ANSWER; the others are distractors.)
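
As a purely illustrative aside (not part of the presentation), here is one way a course team might record the same item anatomy when assembling a common assessment item bank; the class name, fields, and the keyed answer below are assumptions:

```python
# Illustrative sketch only: recording a multiple choice item's stem, options,
# keyed answer, and DOK level. Names and the chosen key are assumptions.
from dataclasses import dataclass

@dataclass
class MultipleChoiceItem:
    stem: str                # the question posed to the student
    options: dict[str, str]  # answer choices keyed by letter
    key: str                 # letter of the correct answer
    dok_level: int = 1       # Webb's Depth of Knowledge level (1-4)

    def distractors(self) -> list[str]:
        # every option that is not the keyed (correct) answer
        return [text for letter, text in self.options.items() if letter != self.key]

item = MultipleChoiceItem(
    stem="Why are we writing items?",
    options={
        "A": "to populate a state item bank",
        "B": "to build common assessments",
        "C": "to work with other teachers",
        "D": "to gain experience writing",
    },
    key="B",      # assumed key, purely for illustration
    dok_level=1,  # recall-level item
)
print(item.distractors())  # the three non-keyed options
```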

14 Constructed Response Items
◦ Constructed response items require students to provide a written response. These questions typically ask students to describe, explain, critique, or evaluate the scenario provided in the stimulus.
◦ Constructed response items have multiple correct responses; however, there may be specific content that is required in the response to receive full credit.
◦ Constructed response items can assess one or more benchmarks and range from low to high complexity.
◦ Constructed response items are scored using a rubric.

15 Algebra Keystone (Module 1 / Module 2)
Assessment Anchors Covered: Operations & Linear Equations & Inequalities / Linear Functions & Data Organizations
# of Eligible Content: 18 / 15
# of Multiple Choice: 18
# of Constructed Response: 3 / 3
Multiple Choice Points: 18
Constructed Response Points: 12
Point split: 60% Multiple Choice / 40% Constructed Response

16 Biology Keystone (Module 1 / Module 2)
Assessment Anchors Covered: Cells and Cell Processes / Continuity and Unity of Life
# of Eligible Content: 16 / 22
# of Multiple Choice: 24
# of Constructed Response: 3 / 3
Multiple Choice Points: 24
Constructed Response Points: 9 / 9
Point split: 73% Multiple Choice / 27% Constructed Response

17 Literature Keystone (Module 1 / Module 2)
Assessment Anchors Covered: Fiction Literature / Nonfiction Literature
# of Eligible Content: 25 / 31
# of Multiple Choice: 17
# of Constructed Response: 3 / 3
Multiple Choice Points: 17
Constructed Response Points: 9 / 9
Point split: 65% Multiple Choice / 35% Constructed Response
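
The multiple choice / constructed response percentages in the three tables above follow directly from the point totals. A minimal sketch of that arithmetic, using the per-module point values copied from the tables (the dictionary and variable names are illustrative only):

```python
# Per-module point values copied from the three Keystone tables above.
keystone_points = {
    "Algebra":    {"multiple_choice": 18, "constructed_response": 12},
    "Biology":    {"multiple_choice": 24, "constructed_response": 9},
    "Literature": {"multiple_choice": 17, "constructed_response": 9},
}

for exam, pts in keystone_points.items():
    total = pts["multiple_choice"] + pts["constructed_response"]
    mc_pct = round(100 * pts["multiple_choice"] / total)
    cr_pct = round(100 * pts["constructed_response"] / total)
    # Algebra -> 60% / 40%, Biology -> 73% / 27%, Literature -> 65% / 35%
    print(f"{exam}: {mc_pct}% multiple choice, {cr_pct}% constructed response")
```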

18 Keystone Exam Difficulty
Keystone Exam test questions will almost all be at Level 2 or Level 3 of Webb's Depth of Knowledge. Which leads us to the next question…

19 What Is Webb’s Depth of Knowledge?

20 Norman Webb

21 Webb’s Depth of Knowledge Norman Webb developed a process and criteria for systematically analyzing the alignment between standards and assessments. Webb’s Depth of Knowledge Model can be used to analyze the cognitive complexity required for the student to master a standard or complete an assessment task.

22 Webb’s Depth of Knowledge Cognitive complexity refers to the cognitive demand associated with a test item. The Depth of Knowledge level of the item is determined by the complexity of the mental processing that the student must use to answer the item.

23 Webb's four levels of complexity
Level 1 - Recall: Recall of a fact, information, or procedure.
Level 2 - Basic Application of Skill/Concept: Use of information, conceptual knowledge, procedures, two or more steps, etc.
Level 3 - Strategic Thinking: Requires reasoning, developing a plan or sequence of steps; has some complexity; more than one possible answer; generally takes less than 10 minutes to do.
Level 4 - Extended Thinking: Requires an investigation; time to think and process multiple conditions of the problem or task; more than 10 minutes to do non-routine manipulations.

24 Cognitive Complexity/Verbs
Low Cognitive Complexity—Level 1: Remember, Recall, Memorize, Recognize, Translate, Rephrase, Describe, Explain, Repeat
Moderate Cognitive Complexity—Level 2: Apply, Execute, Solve, Connect, Classify, Break Down, Distinguish, Compare, Contrast
High Cognitive Complexity—Levels 3 & 4: Integrate, Extend, Combine, Design, Create, Judge, Perform, Value, Assess

25 Same Verb—Three Different DOK Levels
DOK 1 - Describe three characteristics of metamorphic rocks. (Requires simple recall)
DOK 2 - Describe the difference between metamorphic and igneous rocks. (Requires cognitive processing to determine the differences in the two rock types)
DOK 3 - Describe a model that you might use to represent the relationships that exist within the rock cycle. (Requires deep understanding of rock cycle and a determination of how best to represent it)

26 Does Depth of Knowledge Mean Difficulty?

27 DOK is NOT about difficulty
Difficulty is a reference to how many students answer an item correctly.
How many of you know the definition of exaggerate? (DOK Low = Recall) If all or most of you know the definition, this item is an easy one.
How many of you know the definition of illeist? (DOK Low = Recall) If most of you do not know the definition, this item is a difficult one.

28 Complexity vs. Difficulty The Depth of Knowledge levels are based on the complexity of the mental processes the student must use to find the correct answer, not the difficulty of the item!

29 How Does Webb's DOK Relate to Bloom's Taxonomy?

30 Cognitive Level Comparison Matrix: Bloom and Webb (Action Words)
Remember (Bloom): define, identify, name, select, state, order (involves a one-step process) | Webb 1.0: define, identify, name, select, state, order (involves a one-step process)
Understand (Bloom): convert, estimate, explain, express, factor, generalize, give example, identify, indicate, locate, picture graphically (involves a two-step process) | Webb 2.0: apply, choose, compute, employ, interpret, graph, modify, operate, plot, practice, solve, use (involves a two-step process)
Apply (Bloom): apply, choose, compute, employ, interpret, graph, modify, operate, plot, practice, solve, use (involves a three-or-more-step process)
Analyze (Bloom): compare, contrast, correlate, differentiate, discriminate, examine, infer, maximize, minimize, prioritize, subdivide, test | Webb 3.0: compare, contrast, correlate, differentiate, discriminate, examine, infer, maximize, minimize, prioritize, subdivide, test
Evaluate (Bloom): arrange, collect, construct, design, develop, formulate, organize, set up, prepare, plan, propose, create, experiment and record data | Webb 4.0: arrange, collect, construct, design, develop, formulate, organize, set up, prepare, plan, propose, create, experiment and record data
Create (Bloom): appraise, assess, defend, estimate, evaluate, judge, predict, rate, validate, verify

31 Do You Have Item Writing Rules?

32 Item Writing Rules www.bhspd.weebly.com

33 Is This Worth the Effort?
Yes, it's worth the effort! The bar is being raised. The Common Core changes the game. The Keystone Exams change the game. The world our students will enter is FAR MORE COMPLEX. We need to raise the level of our game!

34 Groups
Group A: English, Social Studies, World Languages, Physical Education/Health (Session 1: Library; Session 2: LGI)
Group B: Mathematics, Business, Science, Fine & Practical Arts (Session 1: LGI; Session 2: Library)

35 Your Mission (Due May 6, 2013)
◦ Updated Common Final Exams
◦ Common Adapted ELL Final Exams
◦ Common Adapted Final Exams for students with special needs
◦ Cover Page: identify the course and the teachers who will be administering the assessments; describe the changes that were made; identify the percentages of multiple choice and constructed response questions; identify the percentages for each of Webb's Depth of Knowledge levels.
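
The cover page percentages above amount to a simple tally over the exam's items. A minimal sketch of that bookkeeping, assuming each item is recorded with its type, point value, and DOK level; the items below are made-up placeholders rather than a real exam:

```python
# Hypothetical final exam blueprint: (item type, point value, DOK level).
# The five items below are placeholders; a real exam would list every item.
from collections import Counter

items = [
    ("multiple_choice", 1, 1),
    ("multiple_choice", 1, 2),
    ("multiple_choice", 1, 3),
    ("constructed_response", 4, 2),
    ("constructed_response", 4, 3),
]

# Percentage of points from multiple choice vs. constructed response questions.
total_points = sum(points for _, points, _ in items)
points_by_type = Counter()
for item_type, points, _ in items:
    points_by_type[item_type] += points
for item_type, points in points_by_type.items():
    print(f"{item_type}: {100 * points / total_points:.0f}% of points")

# Percentage of items written at each of Webb's Depth of Knowledge levels.
items_by_dok = Counter(dok for _, _, dok in items)
for level in sorted(items_by_dok):
    print(f"DOK {level}: {100 * items_by_dok[level] / len(items):.0f}% of items")
```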

36 Test Your Depth of Knowledge www.bhspd.weebly.com

