PARCC Bias and Sensitivity Review


1 PARCC Bias and Sensitivity Review
AFT & NEA Item Review Boot Camp, February 9, 2014
INTRODUCTION: Danielle Griswold, PARCC, Program Associate for Policy, Research, and Design
PURPOSE:
- Familiarize yourself with the current PARCC policies for accessibility, accommodations, and fairness, including: (1) features for all students; (2) accessibility features identified in advance; (3) accommodations for students with disabilities; and (4) accommodations for English learners
- Learn how the new PARCC policies for access "stack up" to current state policies
- Identify major challenges and opportunities for implementing the accessibility and accommodations policies in classrooms
- Brainstorm three key messages around the implementation of the manual
QUESTION: Raise your hand if you are an educator of students with special needs.

2 Points of Discussion
- Purpose of Bias and Sensitivity Reviews
- Identifying Bias and Stereotypes
- Sensitivity Awareness and Detecting Passage Bias
- Review for universal design and text complexity
- Guiding Questions and Sample Items

3 Process
Step 1: Item Writing
- PARCC item developers were instructed to use the PARCC Accessibility Guidelines and the Item Development ITN language on writing accessible and universally designed items.
- Vendors each trained item writers on the guidelines and on additional internal guidelines for content and bias & sensitivity.
Step 2: Item Reviewing
- A formal process is being conducted for reviewing items at the Core Leadership, State Educator, and Bias & Sensitivity Committee levels.
- The purpose is to ensure items have been written for content, accessibility, and bias & sensitivity.

4 Purpose of Bias and Sensitivity Review
- Review test materials for potential sources of bias and stereotypes.
- Apply professional test development standards to ensure materials are fair and not insensitive or offensive.

5 Bias and Sensitivity Review Committee Charge
- Your role is to provide expert review of passages for potential sources of bias and insensitivity.
- Use the guidelines and samples provided in training to apply the bias and sensitivity criteria to your review of passages/texts and ensure that passages are not unfair or insensitive.
- Remember: content concerns are addressed by the expert content review committee. It is important to focus exclusively on PARCC's bias and sensitivity review criteria during your review.
- Other concerns identified will be placed in a "parking lot" for consideration by PARCC leadership.
Presenter notes: The Bias and Sensitivity Training and these slides are a review for returning B/S reviewers. The trainer should acknowledge the greater experience of returning reviewers and invite their input and observations, and should mention that each Bias and Sensitivity review room will have the "Typical Topics to Avoid" document (the list of topics to avoid from page 2/23 of the ETS Guidelines for Fairness Review of Assessments, 2009). The document need not be consulted for every single passage, but it is available when reviewers are uncomfortable with the topic being reviewed and need to see whether it is included in the document.

6 What Is Bias?
- Language or content that prevents members of a group from demonstrating they possess the knowledge and skills being measured.
- Language or content that advantages members of a group in demonstrating they possess the knowledge and skills being measured.

7 Common Forms of Bias and Stereotypes
- Regional and geographic bias
- Gender and age stereotypes
- Ethnic, cultural, and religious stereotypes
- Socioeconomic and occupational stereotypes

8 Guideline #1
Avoid Cognitive Sources of Construct-Irrelevant Variance

9 Some Background
- Construct = the knowledge, skill, or other attribute (KSA) you are trying to test
- Construct-relevant = related to the KSA you are trying to test
- Construct-irrelevant = a KSA that is not related to what you are trying to test
- Variance = differences in test scores

10 Variance, Validity & Fairness
- Construct-irrelevant variance lowers validity.
- Construct-irrelevant variance that has different effects across groups lowers fairness.
- Construct-irrelevant differences across groups decrease validity & fairness.
- Construct-relevant differences across groups are valid and, therefore, fair.
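One way to see how these statements fit together is the standard decomposition of an observed score into construct-relevant, construct-irrelevant, and error components. The sketch below is illustrative only (it assumes the three components are uncorrelated) and is not a formula from the PARCC materials:

    X = T + C + E
    Var(X) = Var(T) + Var(C) + Var(E)

Here T is the construct-relevant component, C is the construct-irrelevant component, and E is random error. Validity suffers as the construct-irrelevant share Var(C) / Var(X) grows; fairness suffers when C behaves differently across groups, because those group differences are unrelated to what the test is meant to measure.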

11 Guideline #2
Avoid Affective Sources of Construct-Irrelevant Variance

12 Sensitivity Awareness
Any reference or language in an item or passage that might cause a student to have an emotional reaction during the test administration can prevent the student from accurately demonstrating ability.
Note: The emotional factor is not limited to negative emotions; it can also include a "giggle factor" in items.
At the same time, reviewers should avoid overextending the guidelines to contrive situations in which an innocuous topic is judged to be unfair. That practice inappropriately limits test content, because any topic can be judged potentially upsetting in some set of circumstances for some test takers.

13 Guideline #3
Avoid Physical Sources of Construct-Irrelevant Variance
The purpose of Guideline 3 is to help ensure that there are no unnecessary physical barriers in items or stimulus material (such as needlessly cluttered graphs) that may cause construct-irrelevant score variance, particularly for people with disabilities.

14 Samples of Barriers
- Visual stimuli in the middle of paragraphs
- Decorative rather than informative illustrations
- Fonts that are hard to read
- Letters that look alike (e.g., O, Q) used as labels for different things in the same item/stimulus

15 How Might Item Bias be Detected?
Judgmental procedure: bias and sensitivity review prior to field testing.
Statistical procedure: DIF (Differential Item Functioning) analysis following field tests and operational administrations. (See the sketch below.)
Presenter notes: Explain that other checks happen after this committee (they're not the last line of defense); flags alert us if a subgroup does not perform as expected.
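For context on what a DIF analysis does, the sketch below shows a minimal Mantel-Haenszel screen of the kind commonly run after field testing: students are matched on total score, and the odds of answering the item correctly are compared between a reference group and a focal group across those score strata. This is an illustration of the general technique, not PARCC's operational procedure; the input layout is an assumption, while the delta-scale transformation and cutoffs in the comments follow the widely cited ETS convention.

from collections import defaultdict
from math import log

def mantel_haenszel_dif(responses):
    """responses: iterable of (group, total_score, item_correct) tuples,
    where group is 'reference' or 'focal' and item_correct is 0 or 1."""
    # Stratify examinees by matched total test score.
    strata = defaultdict(lambda: {"A": 0, "B": 0, "C": 0, "D": 0})
    for group, score, correct in responses:
        cell = strata[score]
        if group == "reference":
            cell["A" if correct else "B"] += 1   # reference right / wrong
        else:
            cell["C" if correct else "D"] += 1   # focal right / wrong

    num = den = 0.0
    for cell in strata.values():
        n = sum(cell.values())
        if n == 0:
            continue
        num += cell["A"] * cell["D"] / n
        den += cell["B"] * cell["C"] / n
    if num == 0.0 or den == 0.0:
        return None  # too little data in the strata to estimate the odds ratio

    common_odds_ratio = num / den
    mh_d_dif = -2.35 * log(common_odds_ratio)  # ETS delta scale
    # Common rule of thumb: |D-DIF| < 1.0 negligible, 1.0-1.5 moderate, >= 1.5 large.
    return common_odds_ratio, mh_d_dif

A flag from an analysis like this is a signal for further judgmental review by committees such as this one, not proof of bias on its own.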

16 Guiding Questions for Passages
- Does the passage disadvantage any population (gender, race, ethnicity, language, religion, socioeconomic status, disability, or geographic region) for non-educationally relevant reasons?
- Does the passage contain controversial or emotionally charged subject matter that is not supported by the Common Core State Standards?
- Is the passage potentially offensive, demeaning, insensitive, or negative toward any population?
- Does the passage depict any population in a stereotypical manner?

17 What is Universal Design as Applied to Assessment?
- A concept or philosophy that, when applied to assessments, provides all students with equal opportunities to demonstrate what they have learned.
- Considers the full range of participating students when developing items, tasks, and prompts that measure a desired construct.
- Acknowledges differences among individuals and enables flexible adjustments for a broad range of students. Not one-size-fits-all.
- The purpose is to provide access for the greatest number of students during assessment and to minimize the need for individualized design or accommodations.
Presenter notes: Universal design applied to assessment is analogous to universal design in architecture, where, for example, ramps and curb cuts designed for people in wheelchairs are also considered essential for people without disabilities, such as parents pushing strollers or people moving heavy furniture.

18 Who is Intended to Benefit from Universally Designed Assessments?
ALL students benefit from assessments that are universally designed, including, but not limited to:
- Students who are gifted and talented
- English learners
- Students with physical, cognitive, or sensory disabilities
- Students with emotional or language/learning disabilities
- Students with more than one of these characteristics
- Students with unique linguistic needs
- Underperforming students
- Students without disabilities

19 Finding Out What Students Know and Can Do
- By increasing access at the beginning, through item writing and review, the assessments will allow participation from the widest possible range of students.
- Testing results shouldn't be affected by disability, gender, race, English language ability, etc.
- RESULT? Valid inferences about performance for all students who participate.

20 Universal Design Item Review
The Core Leadership Group, Bias & Sensitivity Review Committees, and Accessibility Reviewers will all review items for the components that make up universally designed assessments.

21 Accessibility Universal Design Review
- Are the items and tasks amenable to accommodations?
- Are the items and tasks designed for maximum readability, comprehensibility, and legibility?
- Does the item or task material use a clear and accessible text format?
- When appropriate, does the item/task material use clear and accessible visual elements?

22 Accessibility Universal Design Review
- Have all accessibility features been considered that may increase access while preserving the targeted construct?
- Have multiple means of presentation, expression, and engagement been considered with regard to the item/task?
- Have changes to the format been considered that do not alter the item/task meaning or difficulty?

23 Questions for Bias & Sensitivity Universal Design Review
- Does the item take into consideration the diversity of the assessment population?
- Are the items sensitive and free of bias?
- Are instructions and procedures simple, clear, and intuitive?

24 Lessons Learned from Phase I Item Development for Accessibility
1. Items that rely on color: if you print out the item in black and white, are you still able to respond to the test item? (See the sketch below.)
2. The use of language for specific devices (tablets, PCs, etc.):
- Use the word "Select" rather than "Click"
- Use the word "Enter" rather than "Type"
- Use the word "Select" or "Choose" rather than "Highlight"
Presenter notes (accessibility and accommodations study):
- Include in preparatory materials explicit instructions on the test format.
- Emphasize in preparatory materials all item types and features that are new or unique to the test (e.g., multiple-select multiple-choice and text-flagging).
- Instruction in keyboarding skills prior to testing will be needed, especially for students with limited exposure to keyboarding.
- Gradually expand the variety and complexity of item types at successive grade levels.
- Provide more explicit instructions (for example, when items are connected or when math items require work to be shown).
- Students should receive adequate instruction in how to access information beyond the initial screen.
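As a rough illustration of the black-and-white check above, the sketch below converts two colors an item depends on to approximate grayscale and flags them if they would look nearly identical when printed without color. The luminance weights are the standard ITU-R BT.601 approximation; the 40-point gap threshold is an assumption for illustration, not a PARCC rule.

def grayscale(rgb):
    """Approximate perceived brightness of an (R, G, B) color on a 0-255 scale."""
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

def distinguishable_without_color(color_a, color_b, min_gap=40):
    """True if the two colors stay visually distinct in a grayscale printout."""
    return abs(grayscale(color_a) - grayscale(color_b)) >= min_gap

# A saturated red and green of similar brightness fail the check, so an item
# that relies only on that color difference needs another cue (labels,
# patterns, or line styles).
print(distinguishable_without_color((200, 40, 40), (40, 180, 40)))  # False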

25 Lessons Learned from Phase I Item Development for Accessibility
3. Labels on geometric figures (see the sketch below):
- Avoid using o, i, and l.
- Avoid using the following letter combinations: u and v; p and q; m and n.
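The screen this slide describes is easy to automate. The helper below is hypothetical (not part of PARCC's review process); it simply flags figure labels drawn from the confusable letters and letter pairs listed above.

CONFUSABLE_LETTERS = {"o", "i", "l"}
CONFUSABLE_PAIRS = [{"u", "v"}, {"p", "q"}, {"m", "n"}]

def flag_confusable_labels(labels):
    """Return warnings for geometric-figure labels that are easy to misread."""
    lowered = {label.lower() for label in labels}
    warnings = [f"avoid the label '{letter}'"
                for letter in sorted(lowered & CONFUSABLE_LETTERS)]
    for pair in CONFUSABLE_PAIRS:
        if pair <= lowered:  # both letters of a confusable pair are in use
            warnings.append("avoid combining " + " and ".join(sorted(pair)))
    return warnings

# Example: labels used on a single figure
print(flag_confusable_labels(["A", "B", "l", "u", "v"]))
# ["avoid the label 'l'", 'avoid combining u and v']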

26 Lessons Learned from Phase I Item Development for Accessibility
4. The use of idioms for ELA: Items should not use idioms or words that obscure what is being asked of students. Items that do may fail to meet UDL guidelines and will need to be rewritten. The only exceptions are items designed to measure comprehension of idioms, or items that refer to a specific line from a text that uses an idiom. In these exceptions, where the idiom is part of the construct being measured, the idiom must remain in the item.

27 Text Complexity Application
- To support assessment item development
- To establish expectations for CCSS and NGSS standards relative to language complexity
- To support validation of assessments
- To provide information to explore how items function for students with differing levels of language proficiency
- To support adaptive item selection algorithm

28 Language Complexity Tool
What to rate:
- Prompts, items, and directions (MC and CR items)
- For CR items, the following is scored: prompt & directions; satisfactory student response (during field testing)
What to do with the rating?
- Record all 4 LC Tool scores as metadata
© 2012 Board of Regents of the University of Wisconsin System.

29 Language Complexity Tool
Presenter notes: Go over the highlighted sections in each row.
© 2012 Board of Regents of the University of Wisconsin System.

30 Language Complexity Tool Scores
[Diagram: how the LC score is calculated; record all 4 ratings. Labels: Text Density (Info Density, Length); Form & Structure (Syntactic Complexity); Vocabulary (Lexical Complexity).]
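As a concrete, hypothetical illustration of recording all four LC Tool scores as item metadata (slide 28), the sketch below attaches the ratings to an item record. The field names follow the labels above; the item identifier, the rating scale, and the record layout are assumptions, not the tool's actual schema.

# Hypothetical example of storing the four Language Complexity ratings with an
# item so they travel through field testing and later analyses.
item_record = {
    "item_id": "ELA-G05-0417",         # hypothetical identifier
    "lc_scores": {                      # the four LC Tool ratings (scale assumed)
        "info_density": 3,
        "length": 2,
        "syntactic_complexity": 4,      # form & structure
        "lexical_complexity": 3,        # vocabulary
    },
}

# Keeping all four ratings as metadata lets later analyses ask how items at
# different language-complexity levels function for students with differing
# levels of language proficiency (slide 27).
assert len(item_record["lc_scores"]) == 4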

31 Let’s look at some sample passages and items.
How To Proceed Let’s look at some sample passages and items.

32 Questions?

33 How To Proceed
- Read through a specific set of test materials, keeping in mind the guiding questions.
- Do not review these materials from the point of view of a content expert; review them only from the perspective of fairness.
- If possible sources of bias, stereotypes, or sensitivity concerns are detected, note them in your comments; they will then be discussed by the group.
- The recorder will take complete and precise notes for the group.

