
Implications for Instruction



Presentation on theme: "Implications for Instruction"— Presentation transcript:

1 Implications for Instruction
Connecting the ICS-M (CCSS) & Smarter Balanced Assessment. While this may be new and somewhat technical information, understanding the big picture of the assessment system that will reflect the ICS is necessary in order to understand how instruction can be informed by this content. The ICS and assessments are all about what happens in the classroom. The Smarter Balanced Consortium emphasizes a balanced assessment system consisting of formative, interim, and summative assessments, all designed to inform instruction. (p. 8) The Consortium Theory of Action for Assessment Systems: As stated in the Smarter Balanced Assessment Consortium's (Smarter Balanced) Race to the Top proposal, "the Consortium's Theory of Action calls for full integration of the learning and assessment systems, leading to more informed decision-making and higher-quality instruction, and ultimately to increased numbers of students who are well prepared for college and careers." (p. 31) Learning Targets: know how the assessment is being developed and why; understand the instructional implications of this information, which should also inform instruction. Any assessment needs to reflect the core shifts of the standards AND prepare students for summative assessments. Nichole Hall, Assessment Coordinator; Nancy Thomas Price, Formative/Interim Assessment Coordinator

2 Learning Targets
I understand….
specifics of the SBAC balanced assessment system
how the Smarter Balanced Content Specifications reflect standards, assessment, and instruction
ways to use knowledge of the new assessment design, tools, and mathematics content to inform classroom instruction
how sample "items/tasks" can be used 'thoughtfully' to elicit evidence about student understanding and teacher instruction
formative assessment strategies that can be used during instruction
We differentiate Learning Targets from Success Criteria (or what the learner will do).

3 Success Criteria
I will show understanding of the major components of the SBAC balanced assessment system and the vocabulary used in that system's specifications. (DOK 1) – Knowledge target
I can specify the assessment claim, target, standard, and depth of knowledge for an item or task. (DOK 2) – Skill target
I can explain my response. (DOK 3) – Reasoning target
I can use the concepts presented to design classroom instruction and assessments that elicit evidence of student learning. (DOK 3) – Reasoning and product targets
I can relate these concepts to other content areas and/or grade levels. (DOK 4) – Product target
We want you to take responsibility for your learning. Clarification: Learning Targets are not intended to be deconstructed standards. They are statements of the focal points of each day's lesson.
Target – Assessment Method Match:
DOK 1 – Knowledge target: selected response (matching, sorting, multiple choice)
DOK 2 – Skill target: performance task
DOK 3 – (Explain) Reasoning target: written response; (Design) Product target: performance task
DOK 4 – (Synthesize) Product target

4 Depth of Knowledge
DOK 1: Recall or identify a fact, definition, or term; focus on initial comprehension.
DOK 2: Demonstrate conceptual understanding through explanation and interpretation (make some decisions).
DOK 3: Strategic thinking, reasoning, planning, using evidence, interpreting.
DOK 4: Extended thinking; relate concepts to other content areas and new situations; synthesize; show a new perspective.
DOK 1 items require only verbatim recall or simple understanding of a word or phrase, so formative assessment will check simple understanding of a concept or definition. DOK 2 requires initial comprehension AND subsequent processing: paraphrase, summarize, infer, classify. Items (formative assessment) may require application of concepts covered in Level 1. DOK 3 requires deep knowledge: explain, generalize, connect ideas, and support thinking.

5 Shared understanding Balanced Assessment System
Formative, Interim, Summative. Attributes of Formative Assessment: Clarify Intended Learning, Elicit Evidence, Interpret Evidence, Act on Evidence. Formative assessment is a deliberate process used by teachers and students during instruction that provides actionable feedback, which is used to adjust ongoing teaching and learning strategies to improve students' self-assessment, reflection, and attainment of curricular learning targets/goals. It can be planned and intentional or impromptu. It is not graded; it is just for learning. It has never been more important than now, because teachers need feedback on their teaching and students need feedback on their learning.

6 Four Attributes
The formative assessment process attributes are: Clarify Intended Learning, Elicit Evidence, Interpret Evidence, and Act on Evidence. Let's take a moment to go over each of the four attributes.
Clarify Intended Learning: a practice implemented by teachers in collaboration with students. Learning Targets: students understand expectations and goals. Success Criteria: observable and measurable evidence of learning.
Elicit Evidence: multiple ways to elicit evidence; can be planned for or spontaneous; informs the teacher, peers, or self.
Interpret Evidence: to determine where students are in regard to the learning target and success criteria; can be conducted by the teacher, student, or both.
Act on Evidence: timely and actionable; provide feedback on where students are in regard to learning targets; make adjustments to instruction.

7 Documents we will be using:
Idaho Core Standards (CCSS) for Mathematics
SBAC Math Content Specifications (Draft) & Item Specifications
Cognitive Rigor Matrix article (Hess, Carlock, Jones, and Walkup) and Matrix
These are the documents we will be referencing today; familiarity with them is essential for understanding the material presented.

8 Content Specifications
Create a bridge between standards, assessment, and instruction. Organize the standards around major constructs and big ideas. Further describe what students should learn and be able to do to demonstrate evidence of their learning. One of the main documents we will be looking at is the Smarter Balanced Content Specifications for Mathematics. Using Smarter Balanced language, the content specifications for both Mathematics and ELA do three very specific things. Create a bridge between standards, assessment, and instruction: right from the start, the content specifications outline the implications for instruction. Organize the standards around major constructs and big ideas: this can be seen through the claims, which we will be discussing on the next slide. Express what students should learn and be able to do: the content specifications outline the evidence that is required of students in order to be successful on the Smarter Balanced assessment.

9 Content Specifications Claims (p. 18)
Conceptual Framework. Claims are the broad statements of the assessment system's learning outcomes, each of which requires evidence that articulates the types of data/observations that will support interpretations of competence towards achievement of the claims. Interpretations are spelled out in the Achievement Level Descriptors. If you turn to p. 18 in the Mathematics Content Specifications, you will see an Overview of the Content Claims and Evidence. The Smarter Balanced Content Specifications' focus is on claims. A claim is a statement of what students know and can do based on the provided evidence. We are saying that this student can… because of the evidence he/she produces (the assessment claim).

10 Conceptual Framework Evidence = Assessment Target
Assessment Targets align with Standards. The standard is the content to be learned, while the assessment target describes, in greater detail, the evidence that will show the content has been mastered.

11 Relationship among Content Claims, Content Categories, Assessment Targets, and Standards (p. 8, ALD Document)
On page 8 in the ALD document, you will see a figure that we are going to reveal in pieces on the current slide; the figure provides a graphic representation of the relationship among claims, content categories, assessment targets, and the related standards in the CCSS.
FIRST CLICK (purple boxes/column): The CCSS are the focal point of a claim.
SECOND CLICK (green boxes/column): Claims must be proven by evidence. The assessment targets map the standards into statements of the evidence that will be collected throughout the assessment to prove the claim. For math, assessment targets are derived from the content cluster headings or the mathematical practices, depending on the number of the claim. (We'll get more specific about this later.) The standards are the what; the assessment targets are the how.
THIRD CLICK: Content Categories, which in math are the Domains.
FOURTH CLICK: A claim is a broad assertion of what students should know and be able to do if the CCSS are taught consistently and in the manner in which they are intended. In a claim, we can say: this is true about a student IF these standards (CCSS) can be demonstrated in this way (ASSESSMENT TARGETS).

12 Evidence-Centered Design
Content Mapping and Content Specifications for Assessment Design, pp. 14 & 15. Smarter Balanced is committed to using evidence-centered design (ECD) in its development of assessments: summative, interim, and formative. The assessment triangle is used to illustrate the fundamental components of ECD, which articulates the relationships among learning models (Cognition), assessment methods (Observation), and the inferences one can draw from the observations made about what students truly know and can do (Interpretation). These three key elements should underlie any assessment system. Application of the assessment triangle contributes to better test design and can be used to gain insights into student learning. The Assessment Triangle was first presented by Pellegrino, Chudowsky, and Glaser in Knowing What Students Know (KWSK; NRC, 2001). The Assessment Triangle (NRC, 2001)

13 Content Categories & Assessment Targets Proposed Reporting Categories
The Assessment Triangle as Represented in the Content Specifications (pp. 14-15): Claims & Rationale; Content Categories & Assessment Targets; Proposed Reporting Categories. The claims and rationale represent the cognition part of the assessment triangle. For each claim and rationale there is a section in the content specifications, representing the observation corner of the triangle, where a narrative description lays out the kinds of evidence that would be sufficient to support the claim. This evidence is the content categories and, more specifically, the assessment targets linked to the Common Core standards. Finally, the interpretation corner of the triangle is represented by a section for each claim that lists the "Proposed Reporting Categories" that the assessment will provide. This is still in draft form; once test blueprints are finalized, scoring specifics will be finalized. The Assessment Triangle (NRC, 2001)

14 Proposed Reporting Categories
The Assessment Triangle as Represented in the Content Specifications (pp. 14-15): Proposed Reporting Categories and Achievement Level Descriptors. Achievement level descriptors (ALDs) are a means of describing performance on a standardized test in terms of levels or categories of performance. The Achievement Level Descriptors are aligned to the CCSS and the Smarter Balanced assessment claims. ALDs are used to explain the knowledge, skills, and processes students display at predetermined levels of achievement. The Assessment Triangle (NRC, 2001)

15 POLL Check for Understanding
Activity DOK 1: POLL Check for Understanding. Terms: Assessment Claim, Assessment Target, Standards, Evidence-Centered Design, Depth of Knowledge, Formative Assessment. Selected response: for each statement I make, decide whether it relates to a claim, target, standard, etc.
The relationship among learning models (Cognition), assessment methods (Observation), and inferences that can be drawn from the observations made about what students truly know and can do (Interpretation). (ECD)
WHAT students should know and be able to do: the content. (Standards)
A PROCESS used during instruction to elicit evidence of student learning with the intent to act on that evidence. (FA)
Broad statements of the assessment system's learning outcomes. (Claims)
In the ICS-M, these are the Domain headings. (Content Category)
Evidence that will show the content and application of knowledge has been mastered. (Assessment Targets)

16 Content Specifications Mathematics
Claims & Assessment Targets

17 Review: Content Standards & the Mathematical Practices
Have out the content standards and the content specifications.

18 Relationship among Content Claims, Content Categories, Assessment Targets, and Standards
Nancy spoke about the relationship among claims, content categories, assessment targets, and the related standards in the CCSS, and how the standards are the focal point of a claim. Before we talk about the content specifications for the Smarter Balanced assessment, we need to briefly review how the CCSS are organized and some important points, especially in regard to the Standards for Mathematical Practice.

19 How to read the grade level standards Standards – p. 5
Standards define what students should understand and be able to do. Clusters are groups of related standards. Note that standards from different clusters may sometimes be closely related, because mathematics is a connected subject. Domains are larger groups of related standards. Standards from different domains may sometimes be closely related.
Number and Operations in Base Ten (NBT): Use place value understanding and properties of operations to perform multi-digit arithmetic. 1. Use place value understanding to round whole numbers to the nearest 10 or 100. 2. Fluently add and subtract within 1000 using strategies and algorithms based on place value, properties of operations, and/or the relationship between addition and subtraction. 3. Multiply one-digit whole numbers by multiples of 10 in the range 10-90 (e.g., 9 × 80, 5 × 60) using strategies based on place value and properties of operations.
I will also be referring to the CCSS-M as the math content standards, and the Idaho Core Standards, throughout this breakout session. The Common Core State Standards for Math are arranged the way you see presented on the slide. FIRST CLICK: Standards define what students should understand and be able to do. Related standards are grouped together in a cluster. SECOND CLICK: Above the cluster, you will see the main standard, or the cluster heading, that the specific standards fall under. The cluster heading conveys the larger intent of a group or cluster of standards. THIRD CLICK: Above that, you will see the Domain, which signifies a larger group of related standards or cluster headings. Domain; Cluster Heading; Cluster of Standards.

20 Grouping the Standards for Mathematical Practice Standards – p. 6-8
The same across all grade levels. Different levels of expertise that educators should seek to develop in their students. The Practices are how students are expected to engage in items or tasks. Overarching habits of mind of a productive mathematical thinker. On pp. 6-8 of the CCSS-M, you will find the Standards for Mathematical Practice; the mathematical practices are not what the teacher is doing, but rather different levels of expertise that mathematics educators, at all grade levels, should seek to develop in their students. The Mathematical Practices are how students are expected to engage in items or tasks. A nice way to look at the mathematical practices is through a graphic representation of how the practices can be grouped. This grouping is provided by William McCallum, from the University of Arizona, one of the writers of the CCSS-M. Keep in mind, as we briefly go over each mathematical practice, that when creating classroom tasks, educators need to pay close attention to the practices just as much as to the content standards, but in a different way. As educators, we need to create tasks that elicit the mathematical practices in our students. The practices are not necessarily innate in our students; we need to provide them with opportunities where they are required to engage in the mathematical practices in order to get to a solution. So, as we go through and analyze some of the problem types from the Smarter Balanced Assessment, keep a close eye on how the items and/or tasks elicit the mathematical practices in our students. William McCallum – The University of Arizona

21 Mathematics Assessment Claims
Keep the terms we just reviewed in mind as we go through the Content Specifications; the Content Specifications will relate directly back to the cluster headings, domains, standards, and mathematical practices.

22 Relationship among Content Claims, Content Categories, Assessment Targets, and Standards
While we have been stressing the fact that the standards are the focal point of the claims in the Smarter Balanced Content Specifications, and that the progression is from the standards to the targets, on to the content categories, and then the claims, I am now going to go further in-depth in each area starting with the claims, then the content categories, and so on, because that is how they are presented in the Smarter Balanced Content Specifications. Please take out the content specifications; we will be referring to different pages of the booklet as we go along. The content specifications in front of you are not the complete document, but rather just the areas we will be looking at today. Also, be sure to have the content standards close by; we will be working between the two documents so you can see how they relate to each other.

23 Math Claims Content Specifications, p. 25
Claim #1: Concepts & Procedures – "Students can explain and apply mathematical concepts and interpret and carry out mathematical procedures with precision and fluency."
Claim #2: Problem Solving – "Students can solve a range of complex well-posed problems in pure and applied mathematics, making productive use of knowledge and problem solving strategies."
Claim #3: Communicating Reasoning – "Students can clearly and precisely construct viable arguments to support their own reasoning and to critique the reasoning of others."
Claim #4: Modeling and Data Analysis – "Students can analyze complex, real-world scenarios and can construct and use mathematical models to interpret and solve problems."
On this slide you will see the four claims for Mathematics, as found in the Content Specifications. Just a quick note: ELA has four claims as well, but they are different.

24 Math Claims Content Specifications, p. 25
Concepts & Procedures: "Students can explain and apply mathematical concepts and interpret and carry out mathematical procedures with precision and fluency." MP 5, 6, 7, & 8. As you look over these claims, you can see how the mathematical practices are interwoven throughout. While the content standards are addressed through all four claims, certain mathematical practices lend themselves more naturally to some claims than to others. Looking at Claim 1, students will be attending to conceptual and procedural mathematics, so Mathematical Practices 5, 6, 7, and 8 will be incorporated into items that are aligned to Claim 1. Students will need to attend to precision, engage in modeling, and see structure and generalize when solving items aligned to Claim 1.

25 Math Claims Content Specifications, p. 25
Problem Solving “Students can solve a range of complex well-posed problems in pure and applied mathematics, making productive use of knowledge and problem solving strategies.” MP 1, 5 & 8 Looking at Claim 2, students will be making productive use of knowledge and problem solving strategies, so Mathematical Practices 1, 5, and 8 will be incorporated into items that are aligned to claim 2. So with problem solving, students will be required to make sense of problems and persevere in solving them; they will still need to model and use appropriate tools, while seeing structure and generalizing, as in claim 1. The main difference between claim 1 and 2 is that in claim 1, the student is using procedural knowledge, rather than engaging in problem solving strategies.

26 Math Claims Content Specifications, p. 25
Communicating Reasoning “Students can clearly and precisely construct viable arguments to support their own reasoning and to critique the reasoning of others.” MP 3 & 6 Looking at Claim 3, students will need to communicate their reasoning by constructing viable arguments to support their reasoning, while being asked to critique the reasoning of others; so, Mathematical Practices 3 and 6 will be incorporated into items that are aligned to claim 3. Students will need to reason and explain solutions, while attending to precision; this is the precision needed when communicating reasoning, as discussed earlier during the review of the mathematical practices.

27 Math Claims Content Specifications, p. 25
Modeling and Data Analysis: "Students can analyze complex, real-world scenarios and can construct and use mathematical models to interpret and solve problems." MP 2, 4, & 5. Looking at Claim 4, students will be attending to the analysis of real-world scenarios while constructing and using mathematical models to support their solutions, so Mathematical Practices 2, 4, and 5 will be incorporated into items that are aligned to Claim 4. Students will need to model their reasoning while explaining their thinking.

28 Can a Task or Assessment Item be Aligned to More Than One Claim? STOP!
“Mathematics is not a collection of separate strands or standards, though it is often partitioned and presented in this manner. Rather, mathematics is an integrated field of study. Viewing mathematics as a whole highlights the need for studying and thinking about the connections within the discipline, as reflected both within the curriculum of a particular grade and between grade levels.” Principles and Standards for School Mathematics; NCTM, 2000

29 Item Types
SR: Selected Response; CR: Constructed Response; ER: Extended Response; PT: Performance Task; TE: Technology Enhanced. This directly relates to the assessment and to how you develop classroom tasks; whether an item or task is aligned to one or multiple claims will affect how the item is presented. Obviously, performance task items will have more than one claim aligned to them. This, in turn, means that an item or task with multiple claims aligned to it will have multiple standards and mathematical practices aligned as well. (Point out the different item types; be sure to let educators know that selected response is much more than just multiple choice. A selected response item can have multiple answers or can be a drag-and-drop type question.) We will be looking at examples of the different item types shortly, and you will see how a selected response item differs from a multiple choice problem.

30 Claim Alignment Practice
ACTIVITY DOK 2-3: Claim Alignment Practice. We are going to do some claim alignment practice. In front of you, you will find a claim alignment practice packet, which we will use repeatedly: first to align the claims, then the assessment targets, and then the DOK, which Nancy will talk about shortly. Explain the Four Corners activity. Be sure to take a pen or pencil with you.

31 Claim Alignment Practice: Grade 7 Item
ACTIVITY DOK 2-3 Part A Determine if each of these statements is always true, sometimes true, or never true. Circle your response. The sum of the measures of two complementary angles is 90°. Always True Sometimes True Never True Part B For each statement you chose as “Sometimes True,” provide one example of when the statement is true and one example of when the statement is not true. Your examples should be a diagram with the angle measurements labeled. If you did not choose any statement as “Sometimes True,” write “None” in the work space below.

32 What claim does this item align to? POLL
ACTIVITY DOK 2-3 What claim does this item align to? POLL Claim 3: Communicating Reasoning Students can clearly and precisely construct viable arguments to support their own reasoning and to critique the reasoning of others. Secondary Claim? Claim 1: Concepts and Procedures Students can explain and apply mathematical concepts and carry out mathematical procedures with precision and fluency.

33 Claim Alignment Practice: Grade 3 Item

34 What claim does this item align to? POLL
ACTIVITY DOK 2-3 What claim does this item align to? POLL Claim 1: Concepts and Procedures Students can explain and apply mathematical concepts and carry out mathematical procedures with precision and fluency. Secondary Claim? No secondary claim

35 Assessment Targets

36 Relationship among Content Claims, Content Categories, Assessment Targets, and Standards

37 Relationship among Content Claims, Content Categories, Assessment Targets, and Standards
Claim 1: assessment targets are the cluster headings; content categories are the math domains in the content standards. Claims 2, 3, & 4: assessment targets are derived from the Mathematical Practices.

38 Claim 1 - Assessment Targets Content Specifications, p. 30
m = major; a/s = additional/supplemental. Assessment Target: CCSS-M cluster heading. Content Category: CCSS-M domain (p. 23). Description of Evidence. Depth of Knowledge.
Let's turn to page 30 in the Content Specifications and look at the claims a little more deeply. For each claim, there is a series of assessment targets; these assessment targets provide the educator with a description of the evidence required of students to score proficient on the Smarter Balanced Assessment. Claim 1's assessment targets are directly related to the Common Core content standards; they are based on the standards for each grade level that will be assessed, so there are assessment targets in Claim 1 for grades 3-8 and grade 11. If we take a look at p. 30, Claim 1 begins with the assessment targets for third grade.
FIRST CLICK on slide: You will notice that the assessment targets are grouped by content category. The content categories are the mathematics domains found in the content standards. As seen on the current slide, the first content category for grade 3 is Operations and Algebraic Thinking. (Turn to p. 23 in the CCSS and you will see that the first content category in the content specifications is the first domain in the CCSS for third grade.)
SECOND CLICK: Right under each content category, you will find the assessment targets that correspond to it. The corresponding assessment targets are the cluster headings in the content standards. So Target A is Represent and Solve Problems Involving Multiplication and Division, which appears in the content standards as the first cluster heading under the domain Operations and Algebraic Thinking for the third grade.
THIRD CLICK: You will also see that each assessment target is identified as either a major or an additional/supplemental target; however, identifying whether the standard being addressed is from a major or an additional/supplemental cluster is not to say that any assessment target can be neglected. All standards are encompassed in the assessment. This simply indicates that the claim being looked at will have a stronger focus on certain standards, designated with an "m" for major. The assessment target we are looking at has been identified as a major standard on the assessment, for Claim 1, for third grade. In the evidence description of almost all assessment targets identified as major for Claim 1, the description also includes how the cluster heading will be assessed in additional claims.
FOURTH CLICK: You will also see the depth of knowledge the assessment target requires of students; we will talk more about depth of knowledge shortly.
FIFTH CLICK: Included within each assessment target is a description of the evidence that students will need to produce to show proficiency on the assessment. So for Target A, the evidence descriptor will let educators know what the items or tasks on the assessment will be asking students to do in relation to the cluster heading identified as the target. This will aid educators in developing lesson plans and units. Claim 1 assessment targets will be assessed primarily through selected response, constructed response, and extended response item types on the Smarter Balanced Assessment; Claim 1 assessment targets will also be assessed as secondary targets on performance task item types. In Part II of this webinar on Thursday, we will be looking more closely at item specifications and item types on the Smarter Balanced Assessment.

39 Relationship between the Idaho Core Standards & the Content Specifications
CCSS (p. 23) and Content Specs (p. 30): Domain = Content Category; Cluster Heading 1 = Target A; Cluster Heading 2 = Target B; Standards = Evidence. Show the relationship between the two documents.

40 CLAIM 1 – Grade 3: Content Categories, Assessment Targets, and Standards
Claim 1: Concepts & Procedures
Operations & Algebraic Thinking: Target A (3.OA.1, 3.OA.2, 3.OA.3, 3.OA.4); Target B (3.OA.5, 3.OA.6); Target C (3.OA.7); Target D (3.OA.8)
Number & Operations - Base Ten: Target E (3.NBT.1, 3.NBT.2, 3.NBT.3)
Number & Operations - Fractions: Target F (3.NF.1, 3.NF.2, 3.NF.3)
Measurement & Data: Target G (3.MD.1, 3.MD.2); Target H (3.MD.3, 3.MD.4); Target I (3.MD.5, 3.MD.6, 3.MD.7); Target J (3.MD.8)
Geometry: Target K (3.G.1)
Nichole: (Also mention how this looks in HS, regardless of Traditional or Integrated.) To give you an example, we used the Smarter Balanced content specifications to show the relationship among the content claims, content categories, assessment targets, and standards. On this slide, you will see the framework for Claim 1 for grade 3. Claim 1 addresses all five domains included in each grade, in this case third grade. Look on p. 22 in the CCSS; the assessment targets are the cluster headings within the CCSS, which all the standards fall under.

41 Claim 1: Concepts & Procedures Operations & Algebraic Thinking
CLAIM 1 – Grade 3: Content Categories, Assessment Targets, and Standards. Claim 1: Concepts & Procedures. Operations & Algebraic Thinking. Target A: Represent and Solve Problems Involving Multiplication and Division (3.OA.1, 3.OA.2, 3.OA.3, 3.OA.4). So let's take a closer look. If we start with the standards in the CCSS booklet, p. 23, the first four standards all fall under the first cluster heading, Represent and solve problems involving multiplication and division, which is Target A in the content specifications (open up to p. 30), which falls under the first domain, Operations and Algebraic Thinking, which is the content category in the content specifications, which falls under Claim 1. As you can see, multiple standards can align to one target; so an assessment item that aligns to Target A for the third grade is addressing four specific standards.

42 CLAIM 1 – Grade 3: Content Categories, Assessment Targets, and Standards
Claim 1: Concepts & Procedures
Operations & Algebraic Thinking: Target D: Solve problems involving the four operations, and identify and explain patterns in arithmetic (3.OA.8)
Number & Operations - Base Ten: Target E: Use place value understanding and properties of operations to perform multi-digit arithmetic (3.NBT.1, 3.NBT.2, 3.NBT.3)
Alternatively, one standard can align to more than one assessment target. If you look at 3.NBT.2 (Fluently add and subtract within 1000 using strategies and algorithms based on place value, properties of operations, and/or the relationship between addition and subtraction), you can see how this standard can fall under both targets.

43 Claims 2, 3, & 4 – Assessment Targets Content Specifications, p. 59
Relevant Verbs. Aligned to the Mathematical Practices. Depth of Knowledge. Description of Engagement.
Now let's look at Claims 2 through 4, which begin on p. 59 in the content specifications; remember, these assessment targets are derived from (FIRST CLICK) the Standards for Mathematical Practice in the Common Core content standards. While the mathematical practices are not the assessment targets word for word, the way the cluster headings are the targets in Claim 1, once an educator is familiar with the mathematical practices it is pretty evident which assessment targets have been derived from which mathematical practice. So for Target A under Claim 2, "apply mathematics to solve well-posed problems arising in everyday life, society, and the workplace," if you look at the Mathematical Practices on pp. 6-8 of the content standards, you can see that Target A is derived from Mathematical Practice 4, Model with Mathematics. While Claim 1 describes the conceptual and procedural side of the content standards, Claims 2, 3, and 4 describe the mathematical practices students will be expected to engage in to solve the content. This is not to say that the mathematical practices will not be a part of assessment items aligned to Claim 1; the mathematical practices have a part in all four claims. Claims 2, 3, and 4 go more in-depth with the mathematical practices, producing item types that have multiple parts and are more performance-task in nature.
SECOND CLICK: Please take note of the relevant verbs identified for each claim. These verbs will aid educators in determining which cluster headings will be assessed through Claims 2, 3, and/or 4. We will look at this more shortly.
THIRD CLICK: The depth of knowledge is also identified for each assessment target.
FOURTH CLICK: Also present is a description of how the student will be engaged in the mathematical practices. Claims 2, 3, & 4 assessment targets will be assessed primarily through constructed response, extended response, and performance task item types on the Smarter Balanced Assessment.

44 Claims 2, 3, & 4: Relevant Verbs
Claim 2 – Problem Solving: Understand, Solve, Apply, Describe, Illustrate, Interpret, Analyze
Claim 3 – Communicating Reasoning: Explain, Justify, Prove, Derive, Assess
Claim 4 – Modeling & Data Analysis: Model, Construct, Compare, Investigate, Build, Estimate, Summarize, Represent, Evaluate, Extend
Now let's go back and take a look at the relevant verbs that will aid educators in determining which cluster headings will be assessed through Claims 2, 3, and/or 4. While it is relatively easy to match up the content standards to the assessment targets in Claim 1, it is a little different with Claims 2, 3, and 4. The table on this slide lists the relevant verbs identified for each claim; notice that some verbs, such as understand and analyze, fall under multiple claims. The relevant verbs will help in identifying the math content clusters and/or standards that will be assessed for each claim. Once the claim is identified through the verbs, the assessment targets that align are dependent on the primary domain and cluster heading being assessed. Certain content lends itself more naturally to certain assessment targets under Claims 2, 3, or 4. We will look at determining assessment targets through relevant verbs more closely later in the webinar, when we discuss implications for instruction.

45 Update #2 to the Content Specifications for Mathematics
Provide a more explicit connection between the content standards and Claim 2 (Problem Solving), Claim 3 (Communicating Reasoning), and Claim 4 (Modeling and Data Analysis) by including the standards for each claim by grade level. Content standards for each grade that support the collection of evidence for Claim 4.

46 Assessment Target Alignment Practice
Shoulder Partner Activity: written answers and white boards.

47 POLL: Assessment Target: Grade 7 Item
ACTIVITY DOK 2 Part A Determine if each of these statements is always true, sometimes true, or never true. Circle your response. The sum of the measures of two complementary angles is 90°. Always True Sometimes True Never True Part B For each statement you chose as “Sometimes True,” provide one example of when the statement is true and one example of when the statement is not true. Your examples should be a diagram with the angle measurements labeled. If you did not choose any statement as “Sometimes True,” write “None” in the work space below.

ACTIVITY DOK 2: What assessment target does this item align to? Please type your answer in the Question Box. POLL. Turn to your partner or jot a note.
Claim 3 (see the Content Specs):
3 B: Construct, autonomously, chains of reasoning that will justify or refute propositions or conjectures.
3 F: Base arguments on concrete referents such as objects, drawings, diagrams, and actions.
Claim 1 (p. 44 in the Content Specs):
1 F: Solve real-life and mathematical problems involving angle measure, area, surface area, and volume.
The Domain / Content Category is Geometry, AND this is a 7th grade question.

49 Assessment Target: Grade 3 Item
ACTIVITY DOK 2

ACTIVITY DOK 2: This item aligns to which assessment target? Write your answer in the question box. 1 I: Geometric measurement: understand concepts of area and relate area to multiplication and to addition. Domain: Measurement and Data (p. 32).

51 What are the Implications for Instruction?
Claims & Assessment Targets

52 Implications for Instruction
Claims & Assessment Targets. Claims: statements about what we assert students can do if we see the specified evidence. Assessment targets: statements that describe how evidence of proficiency on the content standards will be assessed. Remember: the assessment targets in the Smarter Balanced Content Specifications provide the educator with statements of evidence of how proficiency on the content standards will be assessed.

53 Implications for Instruction
Classroom Lesson: Identify the Idaho content standards and cluster headings. Do you have a lesson that you feel aligns to the identified standards and cluster headings? If so…
Content Specifications: Find the corresponding assessment target(s) and claim(s).
Classroom Task: What will the evidence of a proficient student look like?
Content Specifications: Does the evidence descriptor from the assessment target(s) match? Make adjustments, if needed.
Don't forget to address the mathematical practices through Claims 2, 3, and 4 and the evidence described in the corresponding assessment targets.

54 Cognitive Rigor Matrix
Depth of Knowledge Cognitive Rigor Matrix

55 Cognitive Rigor Matrix p. 92-93
The Common Core State Standards require high-level cognitive demand; students are asked to demonstrate deeper conceptual understanding through the application of content knowledge and skills to new situations and sustained tasks. If you turn to pp. 92 and 93 in the Math Content Specifications, you will find a snapshot of the Cognitive Rigor Matrix, which is displayed on the current slide. The Cognitive Rigor Matrix was developed by Karin Hess, Senior Associate with the National Center for Assessment, and several of her colleagues. For each assessment target in the content specifications, the depth(s) of knowledge (DOK) that the student needs to bring to the item/task has been identified using the Cognitive Rigor Matrix. This matrix draws from two widely accepted measures to describe cognitive rigor: Bloom's (revised) Taxonomy of Educational Objectives (first column) and Webb's Depth-of-Knowledge Levels (first row). The Cognitive Rigor Matrix was developed to integrate these two models as a strategy for analyzing instruction, for influencing teacher lesson planning, and for designing assessment items and tasks. There is also a Cognitive Rigor Matrix for English Language Arts that focuses on the processes students will use and the cognitive demand of ELA content. For more detailed information, please read Hess' article listed at the beginning of this presentation.

56 Cognitive Rigor Matrix – Karin Hess
The following video gives a brief two-minute overview of the Cognitive Rigor Matrix. Download this to a flash drive.

57 Assessment Target: Grade 7 Item
Activity – DOK 2 Part A Determine if each of these statements is always true, sometimes true, or never true. Circle your response. The sum of the measures of two complementary angles is 90°. Always True Sometimes True Never True Part B For each statement you chose as “Sometimes True,” provide one example of when the statement is true and one example of when the statement is not true. Your examples should be a diagram with the angle measurements labeled. If you did not choose any statement as “Sometimes True,” write “None” in the work space below.

58 What depth of thinking? POLL
Activity – DOK 2: What depth of thinking? POLL. DOK 1: Recall and Reproduction; DOK 2: Basic Skills and Concepts; DOK 3: Strategic Thinking/Reasoning; DOK 4: Extended Thinking. WHY? What type of thinking would students use to solve this problem (i.e., the cognitive process)?

59 Depth of Knowledge: Grade 3 Item
Activity – DOK 2 Depth of Knowledge: Grade 3 Item How could you modify this task to be DOK 1?

60 WHY? What depth of thinking? What type of thinking?
Activity – DOK 2: What depth of thinking? DOK 1: Recall and Reproduction; DOK 2: Basic Skills and Concepts; DOK 3: Strategic Thinking/Reasoning; DOK 4: Extended Thinking. WHY? What type of thinking would students use to solve this problem (i.e., the cognitive process)?

61 Depth of Knowledge Implications for Instruction
We are now going to see an example of how the content specifications can aid educators in creating or extending existing lesson plans to meet the rigor of the common core.

62 Implications for Instruction
Cognitive Rigor Matrix: Understand how Bloom's Taxonomy and Webb's DOK are alike, yet different. Examine the DOK required for different tasks. Categorize selected assignments and learning activities. Apply to the test design and item development process. The Cognitive Rigor Matrix is helpful in explaining how the two conceptual models are alike, yet different. The matrix allows us to see how the process a student uses to solve a problem relates to the cognitive demand involved. It allows educators to examine the depth of understanding required for different tasks that might seem, at first glance, to be at comparable levels of complexity. The two items we just looked at appeared to be at comparable levels at first glance, and then, after taking a look at the Cognitive Rigor Matrix, we could see that the cognitive demand had increased a bit in the second problem. It also allows educators to categorize and examine selected assignments and learning activities that appear prominently in curriculum and instruction, and furthers understanding of the Smarter Balanced test design and item development process.

63 Implications for Instruction
Classroom Lesson: Identify the Idaho content standards and cluster headings. Do you have a lesson that you feel aligns to the identified standards and cluster headings? If so…
Content Specifications: Find the corresponding assessment target(s).
Classroom Task: What will the evidence of a proficient student look like?
Content Specifications: Does the evidence descriptor from the assessment target(s) match? Make adjustments, if needed.
Classroom Task: What is the cognitive process and demand required of the students?
Cognitive Rigor Matrix: Identify the DOK level aligned to the task. Make adjustments if needed.
Adding on to our implications for instruction from earlier, the Cognitive Rigor Matrix has its own place in making adjustments. Analyze how the lesson or unit engages a student's cognitive process and what cognitive demand is involved. Determine what DOK level the task matches up with on the Cognitive Rigor Matrix. Make adjustments if necessary.

64 Item Specification & Tasks

65 Understanding the Item Specifications
Content & Grade. Type of Question: SR – Selected Response. Claim. Domain: RP – Ratios and Proportional Relationships. Assessment Target for Grade Level – Target A. Internal Number – 181. Claim – C1, C2, C3, or C4 & Target.
The most important thing to know is how to decipher the item specifications for each item. A good portion of an item specification can be readily interpreted; however, we will briefly go over the item specification layout for each item type throughout this presentation. The first item we will be looking at is a selected response item.
FIRST CLICK: At the top of each item specification, you will see a long identification title. The "Mat.06" identifies the content and grade level, so this is a math item for grade 6. The next few letters identify the item type; this is a selected response item, so you will see SR. The number following the item type is the claim number from the Content Specifications; this item focuses primarily on Claim 1. The next few characters signify the CCSS-M domain, or, more accurately, the content category from Claim 1; this item focuses on the Ratios and Proportional Relationships content category, or domain: RP. Next you will see a character that designates which assessment target the item requires evidence from; this item is focused on Assessment Target A, under the Ratios and Proportional Relationships content category, grade 6, Claim 1 of the content specifications. The following number, 181, is a Smarter Balanced internal identification number. The final characters identify the claim and assessment target; for this item, C1, Claim 1, and TA, Target A. You will see that the first two rows repeat the item title.

66 Understanding the Item Specifications
“Claims are the broad statements of the assessment system’s learning outcomes, each of which requires evidence that articulates the types of data/observations that will support interpretations of competence towards achievement of the claims.” p. 18 – Content Specifications The third row of the table then identifies the primary claim and relays what the claim states. Just a reminder from Part I of this webinar: “Claims are the broad statements of the assessment system’s learning outcomes, each of which requires evidence that articulates the types of data/observations that will support interpretations of competence towards achievement of the claims.” p. 18 – Content Specifications

67 Understanding the Item Specifications
Assessment Target. "Cluster level headings of the standards in the CCSS-M are used in order to allow for the creation and use of assessment tasks that require proficiency in a broad range of content and practices. Use of more fine-grained descriptions would risk a tendency to atomize the content, which might lead to assessments that would not meet the intent of the standards." (Content Specs, p. 20) Next, the assessment targets are identified; remember, the assessment targets are the grade-specific cluster headings from the CCSS-M. The cluster headings are used to allow for the creation and use of assessment tasks that require proficiency in a broad range of content and practices. If more fine-grained descriptions were used, the content would be atomized; in essence, it would create an assessment that more closely resembled a checklist of skills, and this would not meet the intent of the Common Core content standards.

68 Understanding the Item Specifications
Next, you will see the Content Domain from the CCSS-M, which is also the content category for Claim 1. Domain: domains, as found in the CCSS-M, are larger groups of related standards.

69 Understanding the Item Specifications
You will then see the grade-specific standard(s) being assessed through the item. Standard(s): defines what students should understand and be able to do.

70 Understanding the Item Specifications
Standards for Mathematical Practice: "Describe varieties of expertise that mathematics educators at all levels should seek to develop in their students." (CCSS-M, pp. 6-8) Make sense of problems & persevere in solving them. Reason abstractly and quantitatively. Next, the item specification identifies the standards for mathematical practice that students will be engaging in when solving the item. In this item, students will be implementing Mathematical Practices 1 and 2 to solve the problem.

71 Understanding the Item Specifications
Depth of Knowledge The cognitive rigor that a student needs to bring to the item/task, as determined by the Cognitive Rigor Matrix, Math Content Specifications, Appendix C, p. 92 The depth of knowledge is identified. If you remember from part I of this webinar, the depth of knowledge is the cognitive rigor that a student needs to bring to an item or task. The DOK is determined by the cognitive rigor matrix.

72 Understanding the Item Specifications
Selected Response: includes computer-enhanced items. Distractors are chosen to embody common misconceptions. Designed to make sure that students do not obtain correct answers because of test-taking skills. Next, the item type is identified. As stated earlier, this is a selected response item. Selected response items include computer-enhanced items, their distractors are chosen to embody common misconceptions, and they are designed to make sure that students do not obtain correct answers because of test-taking skills.

73 Understanding the Item Specifications
The range of difficulty is next; this item is of moderate difficulty. Range of Difficulty: an estimate until the pilot assessment occurs.

74 Understanding the Item Specifications
Key – Correct Answer Next is the key with the correct answer for the item.

75 Now you try it: ACTIVITY
Select one or more items with the attached blank answer sheet. In your teacher team, use the content specifications document to determine the item specifications. Use the answer key to check your answers when you are finished.

76 Work as a team to determine critical specifications for each item
ACTIVITY – DOK 3: Work as a team to determine the critical specifications for each item. Use the standards and content specifications documents to determine:
the item type
the grade level
the content domain
the standard cluster heading
the assessment target(s)
the claim(s)
the mathematical practice(s)
the depth of knowledge

77 Resources Follow-Up Recording
What do you think? How does this knowledge inform your instruction? In a moment, we will unmute so you can share out.

78 Smarter Balanced Navigation of Website

79 Three locations for sample items
Practice and Pilot Tests
Sample items and performance tasks
Item Writing and Review
Achievement Level Descriptors and College Content-Readiness
Computer Adaptive Testing
Technology
Test Administration

80 Learning Targets
I understand….
specifics of the SBAC balanced assessment system
how the Smarter Balanced Content Specifications reflect standards, assessment, and instruction
ways to use knowledge of the new assessment design, tools, and mathematics content to inform classroom instruction
how sample "items/tasks" can be used 'thoughtfully' to elicit evidence about student understanding and teacher instruction
formative assessment strategies that can be used during instruction
We differentiate Learning Targets from Success Criteria (or what the learner will do).

81 Success Criteria
I will show understanding of the major components of the SBAC balanced assessment system and the vocabulary used in that system's specifications. (DOK 1) – Knowledge target
I can specify the assessment claim, target, standard, and depth of knowledge for an item or task. (DOK 2) – Skill target
I can explain my response. (DOK 3) – Reasoning target
I can use the concepts presented to design classroom instruction and assessments that elicit evidence of student learning. (DOK 3) – Reasoning and product targets
I can relate these concepts to other content areas and/or grade levels. (DOK 4) – Product target
We want you to take responsibility for your learning. Clarification: Learning Targets are not intended to be deconstructed standards. They are statements of the focal points of each day's lesson.
Target – Assessment Method Match:
DOK 1 – Knowledge target: selected response (matching, sorting, multiple choice)
DOK 2 – Skill target: performance task
DOK 3 – (Explain) Reasoning target: written response; (Design) Product target: performance task
DOK 4 – (Synthesize) Product target

82 Next Steps
Become more familiar with the content in all of the documents discussed today.
Begin using the documents and the identified implications for instruction when adjusting current activities, lessons, or units to meet the rigor of the ICS, and make sure you have balance.
Visit the websites provided to view sample lesson plans and formative assessment activities.
Complete the post-webinar recording and activity.

83 This presentation can be found at…

84 Evaluation. Please complete a survey on your experience participating in the Connecting the ICS-M and Smarter Balanced Assessment: Implications for Instruction workshop. Your feedback is greatly appreciated and is used to make adjustments in future trainings! To access the survey, please visit the link provided. Thank you for taking time out of your busy schedule to participate in today's workshop!

85 Questions

86 Contact Information
Nancy Thomas Price, Formative and Interim Assessment Coordinator
Nichole Hall, Assessment Coordinator

