Demystifying the Assessments

Presentation on theme: "Demystifying the Assessments"— Presentation transcript:

1 Demystifying the Assessments
AIMS to PARCC: Demystifying the Assessments

2 Presenters: Wendi Anderson, Director for PARCC and Innovative Assessment; Sarah Gardner, ELA/Literacy Educational Specialist; Jessica Eilertson, Mathematics Educational Specialist. All of us

3 Purpose of Webinar: To share information about the transition from our current AIMS Assessments to the upcoming PARCC Assessments. Wendi

4 AIMS: Educator Involvement
Wendi Talking Points: Arizona educators have been an integral part of the AIMS program. From the creation of the blueprints, through item development, through reviewing passages and items for content and bias, through evaluating the items’ data, through selecting which items appear on the assessments, the state’s educators have lent their expertise to develop the tests that are called Arizona’s Instrument to Measure Standards.

5 Standards Addressed
2003 Reading Standards; 2004 Writing Standards; 2004 (updated 2005) Science Standards; 2008 Mathematics Standards. Sarah Talking Points: For the Spring 2013, Fall 2013 (HS only), Spring 2014, and Fall 2014 (HS only) administrations, the following standards are addressed: 2003 Reading Standards, 2004 Writing Standards, 2004 (updated 2005) Science Standards, and 2008 Mathematics Standards. Additionally, items developed in the past two years have been dual-aligned to the Common Core standards in reading and math.

6 AIMS Test Development Cycle
Standards and Test Blueprint Developed Item Development Item Review Field Test, Data Analysis, Selection of Items Final Review Assessment Given and Scored, Results Shared Sarah Talking Points: Standards and Test Blueprint Developed: Educators are selected from all over Arizona to serve on content standards development and AIMS committees. Grade-level standards are created by Arizona educators and adopted by the State Board of Education. Arizona educators develop AIMS test blueprints. Item Specifications for AIMS assessments are created by Arizona educators to provide appropriate, consistent interpretation of the content standards. Item Development: The test company commissions reading passages, and Arizona educators review the passages for content, bias, and cultural sensitivity. Arizona educators create prompts for the writing assessments and develop items for reading, mathematics, and science assessments. The test company performs the first review of all prompts and items for content, bias, and cultural sensitivity. Item Review: Arizona educators review prompts and items for proper alignment to the standards, appropriate and accurate content, and bias and cultural sensitivity. Arizona educators determine which prompts and items to field test. The test company, along with the Arizona Department of Education Assessment Section, performs the second review of all field test prompts and items for alignment, content, bias, and cultural sensitivity. Field Test, Data Analysis, Selection of Items: The test company’s content editors review selected field test prompts and items. Arizona educators analyze item data and select operational items for the next AIMS administrations. The Arizona Department of Education reviews the data from previously field tested writing prompts and selects the operational prompts to be used on the next AIMS administrations.
Final Review: The test company’s research department (psychometrics) reviews the statistics of the prompts and items selected to ensure the quality, validity, and reliability of all AIMS assessments. The test company arranges items for three rounds of reviews and eProofs/Bluelines. The ADE performs Round Reviews and approves the final eProofs/Bluelines. Assessment Given and Scored, Results Shared: Assessments are printed, ordered, and sent to districts and charter organizations. The AIMS assessments are taken by students throughout Arizona and upon completion are sent back to the test company. The test company scores the assessments and reports the results.

7 What Do We Do With The Data?
Check content and coding Check DOK levels Check for standards alignment Check Rasch value Check P-Value Check the PtBis of items Check Infit/Outfit Statistics Check Bias flags Check percent distributions Check the percent of omits Accept/Reject/Revise the item Jessica Talking Points: Once the field testing is complete, data from the items is gathered and a committee is formed to review the items. First, committee members check that the item’s coding is correct for both the old and the Common Core standards and that there are no issues with the content. At every committee meeting where items are reviewed and/or selected, alignment to the standards is verified. The DOK (Depth of Knowledge) level listed for the item is checked for accuracy. The Rasch value allows for the comparison of item difficulty across test forms; acceptable Rasch values generally fall between -3.0 and +3.0. It should be noted that student scores are based on the Rasch values. Committee members check the P-Value, which is the proportion of students who answered the item correctly. An item’s P-Value ranges from 0 to 1. High values, such as .95, tend to indicate very easy items; low values, such as .20, tend to indicate very difficult items. Acceptable P-Values generally fall between .30 and .90; items lower than .25 or higher than .95 must be reviewed prior to acceptance or rejection. The PtBis (point biserial) is the item-to-total test correlation, which shows how an individual item performs in relation to the sum of the other operational items. The Infit/Outfit statistics are indicators of how well the item measures or fits its statistical model. For adherence to the model, an item’s infit should generally fall between 0.6 and about 1.4, and its outfit should stay below about 2.0. The item’s infit is sensitive to unexpected responses at or near the item’s calibrated level; the outfit is sensitive to unexpected responses away from the item’s calibrated level.
Students who have the same ability should have the same chance of getting an item correct regardless of gender, ethnicity, or other characteristics. When a subgroup of students does poorly on a question in comparison to other subgroups, it may indicate bias. The bias indicator is known as DIF (Differential Item Functioning). The DIF flags are as follows: A indicates no significant difference; B indicates a moderate amount of difference; C indicates a significant difference. Percent distribution is the percent of students who selected each answer choice. Percent of omits refers to the percentage of students who did not answer the item; this percentage should be less than 2%. After checking all data, the item is either accepted, rejected, or accepted with revisions. This statistical information is also used when putting the tests together. In October 2004, Arizona educators performed the first item selection. The committee reviewed and selected AIMS items for the Spring 2005 assessments, the first “dual-purpose” AIMS tests. An Item Selection spreadsheet is used for each domain and grade level. This spreadsheet indicates the test blueprint requirements by showing which Strands and Concepts need to be tested, the anchor items used from year to year for equating, the NRT items for Grades 3-8, and the open positions for other operational items. Each year since 2004, Item Selection Committees have verified the coding of items, reviewed the data points presented in this slide for each item, and selected appropriate items that met specific criteria for the overall test. The test vendor’s psychometric staff and the ADE Assessment Section review and approve (or make slight adjustments to) the recommendations of the educator committee.

8 Changes in AIMS: Many AIMS ELA and Mathematics items are aligned to Arizona’s Common Core Standards. Passages have been written with greater text complexity and higher Lexile levels. Items that were selected have increased DOK levels. Public domain/primary source passages and items that address multiple Performance Objectives within a concept will be field tested in 2013. Wendi Talking Points: During the past year or two, changes have been made to the AIMS assessment. These changes include aligning new AIMS ELA and mathematics items to Arizona’s Common Core Standards, commissioning passages with greater text complexity, writing and selecting items with increased depth of knowledge levels, incorporating public domain and primary source passages, and writing items that address multiple performance objectives within a concept, which will be field tested in 2013.

9 Future of AIMS Will the state of Arizona continue to administer the AIMS test past 2014? Wendi Talking Points: Yes and no. AIMS will NOT be given after 2014 for grades 3-8; however, it is possible that AIMS may be given to certain cohorts of high school students beyond 2014. If students start with AIMS in 10th grade, for example, they would need the opportunity to retake the AIMS test. While no decision has been made regarding high school transition issues, Arizona’s Board of Education is reviewing the options and will make decisions in the near future.

10 Complexity vs. Difficulty
KAREN HESS VIDEO: Sarah Talking Points: Video link: Complexity and difficulty describe different mental operations. Complexity describes the thought process that the brain uses to deal with information. This ties with Bloom’s revised Taxonomy. For example, you could ask a student, “What is the capital of Washington?” which is at the remember level, or you could ask a student, “What IS a state capital? Use your own words in your response,” which is at the understand level. The second question is more complex than the first. Difficulty refers to the amount of effort that the learner must expend within a level of complexity to accomplish a learning objective. It is possible for a learning activity to become increasingly difficult without becoming more complex. For example, “Name the states of the Union” is at the remember level because it involves simple recall for most students. The task “Name the states of the Union and their capitals” is also at the remember level but is more difficult. “Name the states and their capitals in order of their admission to the Union” is still at the remember level, but is considerably more difficult than the first two.

11 Complexity vs. Difficulty
[Slide graphic: Bloom’s taxonomy levels KNOWLEDGE, COMPREHENSION, APPLICATION, ANALYSIS, SYNTHESIS, and EVALUATION arranged along a “Level of Thought” (complexity) axis, set against an “Amount of Effort” (DIFFICULTY) axis.] Wendi

12 Webb’s DOK Levels
Level 1 RECALL OF INFORMATION Level 2 BASIC REASONING Level 3 COMPLEX REASONING Level 4 EXTENDED REASONING Sarah Talking Points: Webb’s DOK Levels are also tied to complexity, not difficulty. Level 1: Requires students to use simple skills or abilities to recall or locate facts from the text. Focus is on basic initial comprehension; items require only a literal understanding of the text presented. Examples: verbatim recall from text; simple understanding of a single word or phrase. Level 2: Requires both initial comprehension and subsequent processing of text or portions of text. Important concepts are covered, but not in a complex way; literal main ideas are stressed. Examples: determine fact vs. opinion; summarize; paraphrase; interpret. Level 3: Requires deep knowledge; students are asked to go beyond the text and explain, generalize, and connect ideas. Students must be able to support their thinking, citing evidence from the text or other sources. Examples: abstract theme identification; inferences between/among passages; application of prior knowledge; text support for analytical judgment about text. Level 4: Requires complex reasoning, planning, developing, and thinking, often over an extended period of time. Examples: comparing multiple works by the same author or from the same time period; evaluating relevancy and accuracy of information from multiple sources; gathering, analyzing, organizing, and interpreting information from multiple sources.

13 Hess Cognitive Rigor Matrix
Webb’s DOK crossed with Bloom’s Taxonomy. Columns: DOK Level 1, Recall & Reproduction; DOK Level 2, Basic Skills & Concepts; DOK Level 3, Strategic Thinking & Reasoning; DOK Level 4, Extended Thinking.
Remember: recall, locate basic facts, definitions, details, events.
Understand: select appropriate words for use when intended meaning is clearly evident; specify, explain relationships; summarize; identify central ideas; explain, generalize, or connect ideas using supporting evidence; explain how concepts or ideas specifically relate to other content domains or content.
Apply: use language structure or word relationships to determine meaning; use context to identify word meanings; obtain/interpret information using text features; use concepts to solve non-routine problems; devise an approach among many alternatives to research a novel problem.
Analyze: identify the kind of information contained in a graphic, table, visual, etc.; compare literary elements, facts, terms, events; analyze organization & text structures; analyze or interpret author’s craft to critique a text; analyze multiple sources or texts; analyze complex/abstract themes.
Evaluate: cite evidence and develop a logical argument for conjectures based on one text or problem; evaluate relevancy, accuracy & completeness of information across texts/sources.
Create: brainstorm ideas, concepts, etc. related to a topic or concept; generate hypotheses based on observations or prior knowledge; develop a complex model for a situation; develop an alternative solution; synthesize information across multiple sources or texts; articulate new voice or perspective.
Talking Points: What Karen Hess did was combine Webb’s Depth of Knowledge with Bloom’s Taxonomy to create a chart that assists teachers in increasing the rigor in their classrooms. The PARCC assessment fits well with this matrix because for ELA/Literacy there is a requirement to cite evidence for every item and task on the assessment.
Both mathematics and ELA/literacy have a performance based assessment that is designed to measure those difficult to assess skills in DOK level four. All items on PARCC in ELA/literacy require citing evidence—which moves everything to higher DOK levels.

14 Common Core Key Shifts—ELA/Literacy
Building knowledge through content-rich nonfiction. Reading, writing and speaking grounded in evidence from text, both literary and informational. Regular practice with complex text and its academic language. Wendi Talking Points: Common Core originally released six shifts but later combined them into three. The key shifts for ELA/Literacy focus on content-rich nonfiction, using evidence from text, and engaging with complex text.

15 AIMS to PARCC Grade 3 ELA/literacy Standards
Standards Assessed by AIMS Standards Assessed by PARCC 3S1C6.PO4 Answer clarifying questions in order to comprehend text. 3.RI.1 Ask and answer questions to demonstrate understanding of a text, referring explicitly to the text as the basis for the answers. 3.RI.2 Determine the main idea of a text; recount the key details and explain how they support the main idea. 3.RI.10 (Read and comprehend informational texts…) Wendi Talking Points: Notice that for the AIMS assessment, each item addresses ONE performance objective, in this case 3S1C6.PO4. The PARCC assessment items, on the other hand, address multiple standards. As was already mentioned, every item and task MUST be aligned to RI or RL.1 AND RI or RL.10, as well as to another standard. In this case, the item is aligned to 3.RI.1, 3.RI.2, AND 3.RI.10.

16 ELA/Literacy—3rd Grade Reading Items
AIMS PARCC Which of these is not a way to buy the Zoomster Deluxe? Go to the store Call the toll-free number Go on-line Mail order with payment Read all parts of the question before responding. Part A: What is one main idea of “How Animals Live?” There are many types of animals on the planet. Animals need water to live. There are many ways to sort different animals. Animals begin their life cycles in different forms. Part B: Which detail from the article best supports your answer to Part A? “Animals get oxygen from air or water.” “Animals can be grouped by their traits.” “Worms are invertebrates.” “All animals grow and change over time.” “Almost all animals need water, food, oxygen, and shelter to live.” Sarah Talking Points: For the AIMS item, students would need to read the text, but just at a recall level. They would need to find each of the distractors in the passage in order to eliminate them and find the answer. The PARCC item, on the other hand, asks students not only to identify a main idea from the passage but also to provide textual evidence to support their answer. This requires students to think at a higher cognitive level and then support their answer with evidence from the text.

17 AIMS to PARCC Grade 3 ELA/literacy Standards
Standards Assessed by AIMS Standards Assessed by PARCC 3S3C2.PO4 Interpret information in functional documents (e.g., maps, schedules, pamphlets) for a specific purpose. 3.RI.1 Ask and answer questions to demonstrate understanding of a text, referring explicitly to the text as the basis for the answers. 3.RI.3 Describe the relationship between a series of historical events, scientific ideas or concepts, or steps in technical procedures in a text, using language that pertains to time, sequence, and cause/effect. 3.RI.10 Read and comprehend informational texts… Sarah Talking Points: Again, you can see that what is covered in an AIMS ELA item is limited to one performance objective, while several standards are covered in a PARCC ELA/literacy item.

18 ELA/Literacy—3rd Grade Reading Items
AIMS PARCC According to the passage, which of the following is true about the Zoomster 2000? It is slower than the Zoomster Deluxe. It is a new toy. It saves time and money. It works on rough ground. Drag the words from the word box into the correct locations on the graphic to show the life cycle of a butterfly as described in “How Animals Live.” Wendi This AIMS item requires students to analyze, but textual evidence is not required to back up the answer. The PARCC item requires students to complete the life cycle of a butterfly AS IT IS DESCRIBED in “How Animals Live.” There are a couple of ways to begin this cycle, and if students rely on what they know already, they may end up with the wrong answer. Instead, students will need to go back to the passage and carefully read to determine the order of the stages. This item is also TECHNOLOGY ENHANCED, which means it is interactive. The use of science texts such as the one used with this item aligns with key shift 1: Building knowledge through content-rich nonfiction as well as key shift 3: Regular practice with complex text and its academic language.

19 AIMS to PARCC Grade 5-6 ELA/literacy Standards
Standards Assessed by AIMS Standards Assessed by PARCC 5S3C1.PO1 (Strands 1 and 2 are also assessed) Write a narrative based on imagined or real events, observations, or memories that includes: Characters Setting Plot Sensory details Clear language Logical sequence of events 6.RL.1 Cite textual evidence to support analysis of what the text says explicitly as well as inferences drawn from the text. 6.W.3 Write narratives to develop real or imagined experiences or events using effective technique, relevant descriptive details, and well-structured event sequences. 6.L.1-3 Command of conventions Sarah Talking Points: The AIMS writing prompt addresses several strands and performance objectives, as do the PARCC constructed response items. The main difference here is the tie to text and the requirement to cite textual evidence in the written response.

20 ELA/Literacy—Writing
AIMS PARCC 5th grade Writing Prompt: Imagine one morning you wake up and look in the mirror to see a different reflection. You realize you have turned into the principal of your school. Write a story in which you describe what happens when you go to school that day. 6th grade Prose Constructed Response: In the passage, the author developed a strong character named Miyax. Think about Miyax and the details the author used to create that character. The passage ends with Miyax waiting for the black wolf to look at her. Write an original story to continue where the passage ended. In your story, be sure to use what you have learned about the character Miyax as you tell what happens to her next. Wendi Here is another example of the differences between AIMS and PARCC. The AIMS writing prompt is not tied to text in any way. The PARCC constructed response task requires students to identify the characteristics of a character and then use these characteristics to extend the story. This ties to key shift #2: Reading, writing and speaking grounded in evidence from text, both literary and informational.

21 Key Shifts—Mathematics
Focus strongly where the Standards focus Coherence: Think across grades, and link to major topics within grades Rigor: In major topics pursue conceptual understanding, procedural skill and fluency, and application with equal intensity. Jessica Again, Common Core originally had six shifts for mathematics, but they have since been combined into three. The key shifts in mathematics have to do with focus, coherence and rigor. Additionally, there is a focus on conceptual understanding.

22 AIMS to PARCC: Grade 3 Mathematics Standards
Standard Assessed by AIMS Standard(s) Assessed by PARCC 3S1C1.5 Express benchmark fractions as fair sharing, parts of a whole, or parts of a set. For Mathematical Content: 3.NF.1 Understand a fraction 1/b as the quantity formed by 1 part when a whole is partitioned into b equal parts; understand a fraction a/b as the quantity formed by a parts of size 1/b. For Mathematical Practice: MP.2 (reason abstractly and quantitatively) and MP.7 (Look for and make use of structure) Jessica Notice the difference between these two standards. Both center on fractions, but the Common Core Standard focuses on UNDERSTANDING the relationship of a fraction as part of a whole.

23 Mathematics—3rd Grade Items
AIMS PARCC Gloria and her 3 friends will share a pizza equally. Which fraction shows the portion of the pizza each person will receive? 1/4 1/3 3/8 3/4 Jessica Both the AIMS and the PARCC item cover fractions. In the PARCC example, however, students must have a deeper conceptual understanding of fractions in order to successfully complete the task. Additionally, students will have to explain why the two fractions in the PARCC item are equal. Since the PARCC item is not multiple choice, it would be difficult for students to guess the correct answer. There are 28 possible ways to drag 6 soybeans onto a grid of 8 squares, and all such responses are correct.
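The count of 28 placements quoted above is just the binomial coefficient C(8, 6), choosing which 6 of the 8 squares receive a soybean; and the conceptual point of such an item (assuming the intended equivalence is 6/8 of the grid versus 3/4) is that two fractions can name the same number. A quick check with exact rational arithmetic:

```python
from math import comb
from fractions import Fraction

# Number of distinct ways to place 6 identical soybeans on 8 squares:
# choose which 6 of the 8 squares are filled.
print(comb(8, 6))  # 28

# The equivalence a student must explain (illustrative assumption: the
# item contrasts 6/8 of the grid with 3/4): Fraction reduces to lowest
# terms, so the comparison is exact.
print(Fraction(6, 8) == Fraction(3, 4))  # True
```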

24 AIMS to PARCC: Grade 3 Mathematics Standards
Standard Assessed by AIMS Standard(s) Assessed by PARCC 3S1C1.6 Compare and order benchmark fractions For Mathematical Content: 3.NF.2 Understand a fraction as a number on the number line; represent fractions on a number line diagram. For Mathematical Practice: MP.7 (Look for and make use of structure) and MP.5 (Use appropriate tools strategically) Jessica Talking Points: These standards both deal with fractions, but the PARCC standard again focuses on understanding, as well as mathematical practice.

25 Mathematics—3rd Grade Items
AIMS PARCC Which fraction is greater than 1/2? 1/6 2/6 3/6 4/6 Jessica Talking Points: While the AIMS item requires students to compare and order fractions, the PARCC item requires students to understand fractions as numbers on a number line. It would be difficult with the PARCC item to guess the correct answer or use a choice elimination strategy. Students will have to understand a fraction and its relationship to whole numbers in order to answer the question correctly.
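The comparison the AIMS item asks for can be made explicit with exact rational arithmetic; this is a sketch of the underlying mathematics (treating each fraction as a position on the number line), not the PARCC item format:

```python
from fractions import Fraction

half = Fraction(1, 2)
choices = [Fraction(1, 6), Fraction(2, 6), Fraction(3, 6), Fraction(4, 6)]

# Only 4/6 lies to the right of 1/2 on the number line; note that
# Fraction normalizes 4/6 to its lowest terms, 2/3.
greater = [f for f in choices if f > half]
print(greater)  # [Fraction(2, 3)]
```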

26 PARCC Goals
PARCC Goals Create high-quality assessments Build a pathway to college and career readiness for all students Support educators in the classroom Develop 21st century, technology-based assessments Wendi Talking points: Read goals and discuss.

27 Evidence Based Assessment
What is college and career ready? Sarah Talking points: The PARCC is claims- and evidence-designed. This means that before the test could be created, claims had to be made based on the Common Core Standards. The main claim, of course, is that students should be college and career ready when they leave high school. According to the ACT definition adopted by Common Core: college and career readiness is the acquisition of the knowledge and skills a student needs to enroll and succeed in credit-bearing, first-year courses at a postsecondary institution (such as a two- or four-year college, trade school, or technical school) without the need for remediation. The national professional organization for CTE has drafted a definition for career readiness: “Career readiness involves three major skill areas: core academic skills and the ability to apply those skills to concrete situations in order to function in the workplace and in routine daily activities; employability skills (such as critical thinking and responsibility) that are essential in any career area; and technical, job-specific skills related to a specific career pathway.”

28 CLAIMS DRIVING DESIGN: MATHEMATICS
Master Claim: On-Track for college and career readiness. Students solve grade-level/course-level problems in mathematics as set forth in the Standards for Mathematical Content with connections to the Standards for Mathematical Practice. Sub-Claim A: Students solve problems involving the major content for their grade level with connections to practices. Sub-Claim B: Students solve problems involving the additional and supporting content for their grade level with connections to practices. Sub-Claim C: Students express mathematical reasoning by constructing mathematical arguments and critiques. Jessica Talking Points: The master claim, as always, is that students will be on-track for college and career readiness. The PARCC assessment is based on the master claim: “Students are on track or ready for college and careers.” In understanding the claim, we need to consider both the Common Core State Standards for Mathematical Practice and the Common Core State Standards for Mathematical Content. We will begin by investigating how the Mathematical Practices are embedded in the sub-claims. The second sentence of the master claim states: Students solve grade-level/course-level problems in mathematics as set forth in the Standards for Mathematical Content with connections to the Standards for Mathematical Practice. For a short time, let’s focus on those bolded words that draw our attention to the Standards for Mathematical Practice. The extension (or explanation or elaboration) of the master claim is shown in the five statements called sub-claims. The Practices are explicitly referred to in the first two sub-claims: students will solve problems “with connections to practices.” The topic of major content and additional and supporting content will be discussed in the next session.
The 3rd sub-claim refers specifically to mathematical reasoning, which is the focus of Mathematical Practice 3 (Construct viable arguments and critique the reasoning of others). As students construct arguments, they must attend to precision in their use of vocabulary; that attention to precision is Mathematical Practice 6. The 4th sub-claim specifically mentions the “modeling practice,” the focus of Mathematical Practice 4 (Model with mathematics). When students engage in modeling to solve appropriately difficult problems, they are likely to engage in one or more of the remaining Mathematical Practices. The last sub-claim is for grades 3-6 only and focuses on fluency, which can only be developed by engaging in many mathematical practices. We will examine fluency in a later session. Sub-Claim D: Students solve real world problems engaging particularly in the modeling practice. Sub-Claim E: Students demonstrate fluency in areas set forth in the Standards for Content in grades 3-6.

29 Overview of Mathematics Task Types
PARCC mathematics assessments will include three types of tasks: Task Type Description of Task Type I. Tasks assessing concepts, skills and procedures Balance of conceptual understanding, fluency, and application Can involve any or all mathematical practice standards Machine scorable including innovative, computer-based formats Will appear on the End of Year and Performance Based Assessment components II. Tasks assessing expressing mathematical reasoning Each task calls for written arguments / justifications, critique of reasoning, or precision in mathematical statements (MP.3, 6). Can involve other mathematical practice standards May include a mix of machine scored and hand scored responses Included on the Performance Based Assessment component III. Tasks assessing modeling / applications Each task calls for modeling/application in a real-world context or scenario (MP.4) Can involve other mathematical practice standards. Jessica Talking Points: There are three types of tasks that will be included on the PARCC summative math assessments. The first type will assess concepts, skills, and procedures. These items will be more similar to AIMS-type math questions than the other two. These items are machine scorable and will appear on both the Performance Based and End of Year Assessments. The second type will assess how well students can express mathematical reasoning. Students will write justifications and arguments, critique reasoning, or attend to precision in mathematical statements. Some of these items will be hand scored, and these task types will be included on the Performance Based Assessment. The third type assesses modeling and applications. These will tie in to a real-world context or scenario, and again some of these items will be hand scored. These task types will be included on the Performance Based Assessment.

30 PARCC Math Assessment Prototypes: Grade 7
Jessica This item is hyperlinked to the PARCC website, where you will be able to show participants the interactive nature of this type of item. While this presentation is in Slide Show mode, simply click on the item itself to access the website.

31 PARCC Math Assessment Prototypes: Grade 10
This item is hyperlinked to the PARCC website, where you will be able to show participants the interactive nature of this type of item. While this presentation is in Slide Show mode, simply click on the item itself to access the website.

32 Claims Driving Design: ELA/literacy
Students are on-track or ready for college and careers. Students read and comprehend a range of sufficiently complex texts independently (Reading Literature; Reading Informational Text; Vocabulary Interpretation and Use). Students write effectively when using and/or analyzing sources (Written Expression; Conventions and Knowledge of Language). Students build and present knowledge through research and the integration, comparison, and synthesis of ideas. Talking Points: The master claim, again, is that students will be on-track for college and career readiness. The main claims for ELA/literacy are: Students read and comprehend a range of sufficiently complex texts independently; this includes reading literature and informational text and understanding vocabulary interpretation and use. Students write effectively when using and/or analyzing sources; this claim is divided into written expression and conventions and knowledge of language. Students build and present knowledge through research and the integration, comparison, and synthesis of ideas.

33 PARCC Summative Assessment with EBSR, TECR, and PCR Items (ELA/Literacy)
Performance-Based Assessment
Literary Analysis Task: This task plays an important role in honing students' ability to read complex text closely, a skill that research reveals as the most significant factor differentiating college-ready from non-college-ready readers. It asks students to carefully consider literature worthy of close study and compose an analytic essay.
Narrative Task: This task broadens the ways in which students may use narrative writing, which can convey experiences or events, real or imaginary. Students may be asked to write a story, detail a scientific process, write a historical account of important figures, or describe an account of events, scenes, or objects.
Research Simulation Task: This component is worthy of student preparation because it asks students to exercise the career- and college-readiness skills of observation, deduction, and proper use and evaluation of evidence across text types. Students analyze an informational topic presented through several articles or multimedia stimuli, the first text being an anchor text that introduces the topic. Students engage with the texts by answering a series of questions and synthesizing information from multiple sources in order to write two analytic essays.
End-of-Year Assessment
On the End-of-Year Assessment, students have the opportunity to demonstrate their ability to read and comprehend complex informational and literary texts. Questions are sequenced to draw students into deeper encounters with the texts, resulting in more thorough comprehension of the concepts.
Talking Points: There are three main item types and three main task types on the ELA/literacy PARCC assessments.
Item types:
EBSR (Evidence-Based Selected Response): included on both the End-of-Year Assessment and the Performance-Based Assessment.
TECR (Technology-Enhanced Constructed Response): included on the Performance-Based Assessment and possibly on the End-of-Year Assessment.
PCR (Prose Constructed Response): included on the Performance-Based Assessment.
Task types for the Performance-Based Assessment:
Literary Analysis Task: Students will write an analytic essay after careful reading of "literature worthy of close study."
Narrative Task: Students will write to convey experiences or events, but narrative is expanded for this task. Students may be asked to detail a scientific process, write a historical account of important figures, or describe an account of events, scenes, or objects.
Research Simulation Task: Students will analyze an informational topic through articles and possibly multimedia stimuli. The anchor text introduces the topic, and two other texts follow. Students engage with the texts by answering questions and synthesizing information in order to write two analytic essays.

34 PARCC ELA Assessment Prototypes: Grade 7
Prose Constructed Response to Research Simulation Task (Analytical Essay) Sarah The picture is hyperlinked to the PARCC website, where you will be able to show participants the interactive nature of this type of item. While this presentation is in Slide Show mode, simply click on the item itself to access the website.

35 PARCC ELA Assessment Prototypes: Grade 10
Prose Constructed Response – Literary Analysis Task Sarah The picture of the businessman with wings is hyperlinked to the PARCC website, where you will be able to show participants the interactive nature of this type of item. While this presentation is in Slide Show mode, simply click on the item itself to access the website.

36 Arizona Involvement in PARCC
Wendi

37 Support and Resources
List of websites
Contact information: We need to add the varying support and resources that are or will be available to educators.

