ELL Language-Based Accommodations for Content Area Assessments
The University of Central Florida, Cocoa Campus
Jamal Abedi, University of California, Davis
July 7,

ELL Language-Based Accommodations
• English dictionary
• English glossary
• Bilingual dictionary/glossary
• Customized dictionary
• Native language testing
• Read-aloud test items or directions
• Linguistically modified test
• Computer testing with pop-up glossaries

English Dictionary
• Providing an English dictionary is another commonly used accommodation for ELL students (Abedi, Courtney, & Leon, 2003; Abedi, Lord, Boscardin & Miyoshi, 2000).
• The use of a dictionary and extra time affects the performance of all students (Abedi, Lord, Hofstetter & Baker, 2000; Hafner, 2001; Maihoff, 2002; Thurlow, 2001; Thurlow & Liu, 2001).
• By gaining access to definitions of content-related terms, recipients of a dictionary may be advantaged over those who do not have access to one. This may compromise the validity of the assessment (Abedi, Courtney, Mirocha, Leon & Goldberg, 2005).
• The dictionary as a form of accommodation suffers from another major limitation: feasibility (Abedi, Courtney, Mirocha, Leon & Goldberg, 2001).
• Consequently, the results of accommodated and non-accommodated assessments may not be aggregated.

English Glossary
• An English glossary with extra time raised the performance of both ELL and non-ELL students (Abedi, Hofstetter & Lord, 2000).
• ELL students’ performance increased by 13% when they were tested under the glossary-plus-extra-time accommodation.
• While this looks promising, it does not present the entire picture.
• Non-ELL students also benefited from this accommodation, with an increase of 16% (Abedi, Hofstetter & Lord, 1998, 2000).
• Thus, the accommodated outcomes cannot be aggregated with the non-accommodated outcomes.

Customized Dictionary
• The customized dictionary was introduced as a more valid alternative to English/bilingual dictionaries (Abedi, Courtney, Mirocha, Leon & Goldberg, 2001).
• It is a cut-and-paste from actual dictionaries.
• It only includes terms that are (1) in the test and (2) not content related.
• Results of studies suggest that it is a highly effective and valid accommodation for ELL students.

Linguistically Modified Test
• There are, however, some accommodations that help ELL students with their English language needs without compromising the validity of the assessment.
• Studies suggest that a linguistically modified version of the test items is an effective and valid accommodation for ELL students (Abedi, Hofstetter, Lord & Baker, 2000; Maihoff, 2002; Rivera & Stansfield, 2001).
• This accommodation also helped students with learning disabilities.
• Thus, an accommodation can be effective and still provide valid assessment outcomes for ELL students.

Computer Testing
• Research findings suggest computer testing is an effective and valid accommodation for ELL students (Abedi, Courtney, Mirocha, Leon, & Goldberg, 2001).
• ELL students show higher levels of motivation on assessments administered by computer.
• Different types of accommodations that have been shown to be useful for ELL students may be incorporated into a computer testing system (Abedi et al., 2011).

Native Language Testing
• Translating tests into students’ native language is an accommodation used by many states across the country (Abedi, Lord, Hofstetter & Baker, 2000; Rivera, Stansfield, Scialdone & Sharkey, 2000).
• Issues concerning translation and content coverage across forms must be seriously considered.
• Students’ background variables, particularly their level of proficiency in L1 and L2, must be studied before considering this accommodation.
• Native language assessment may not produce desirable results if the language of instruction and the language of assessment are not aligned (Abedi, Hofstetter & Lord, 2004).

Examining Complex Linguistic Features in Content-Based Test Items
Unnecessarily complex linguistic features slow down the reader, make misinterpretation more likely, and add to the reader’s cognitive load, thus interfering with concurrent tasks. These features include (see the sketch after this list):
Concrete vs. abstract or impersonal presentations
Item length
Unfamiliar vocabulary
Nominal heaviness
Relative clauses
Conditional clauses
Passive voice
Long noun phrases
Subordinate clauses
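For illustration only, here is a minimal Python sketch of how a few of these surface features might be counted automatically. It is not drawn from the cited studies, and the regular-expression heuristics are assumptions; real linguistic-complexity coding relies on trained reviewers and parsers.

```python
import re

# Heuristic flags for a few of the surface features listed above
# (a rough sketch, not a validated instrument from the cited research).
PASSIVE = re.compile(r"\b(?:is|are|was|were|be|been|being)\s+\w+(?:ed|en)\b", re.I)
CONDITIONAL = re.compile(r"\b(?:if|unless|provided that)\b", re.I)
RELATIVE = re.compile(r"\b(?:which|who|whom|whose|that)\b", re.I)

def flag_features(item_text: str) -> dict:
    """Return rough counts of a few linguistic-complexity indicators."""
    words = item_text.split()
    sentences = [s for s in re.split(r"[.!?]+", item_text) if s.strip()]
    return {
        "item_length_words": len(words),
        "avg_sentence_length": len(words) / max(len(sentences), 1),
        "passive_voice_hits": len(PASSIVE.findall(item_text)),
        "conditional_clause_hits": len(CONDITIONAL.findall(item_text)),
        "relative_clause_hits": len(RELATIVE.findall(item_text)),
    }

if __name__ == "__main__":
    item = ("If Y represents the number of newspapers that Lee delivers each day, "
            "which of the following represents the total number of newspapers "
            "that Lee delivers in 5 days?")
    print(flag_features(item))
```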

Impact of language factors on content assessments for ELLs
Continuum of Linguistic Complexity and Item Characteristics, Level 1
Items with no linguistic complexity:
Familiar or frequently used words; word length generally shorter
Short sentences and limited prepositional phrases
Concrete item(s) and a narrative structure
No complex conditional or adverbial clauses
No passive voice or abstract or impersonal presentations

Impact of language factors on content assessments for ELLs
Continuum of Linguistic Complexity and Item Characteristics, Level 2
Items with a minimal level of linguistic complexity:
Familiar or frequently used words; short word length
Moderate sentence length with a few prepositional phrases
Concrete item(s)
No subordinate, conditional, or adverbial clauses
No passive voice or abstract or impersonal presentations

Impact of language factors on content assessments for ELLs
Continuum of Linguistic Complexity and Item Characteristics, Level 3
Items with a moderate level of linguistic complexity:
Unfamiliar or seldom used words
Long sentence(s)
Abstract concept(s)
Complex sentence / conditional tense / adverbial clause(s)
A few passive voice or abstract or impersonal presentations

Impact of language factors on content assessments for ELLs
Continuum of Linguistic Complexity and Item Characteristics, Level 4
Items with a high level of linguistic complexity:
Relatively unfamiliar or seldom used words
Long or complex sentence(s)
Abstract concept(s)
Difficult subordinate, conditional, or adverbial clause(s)
Passive voice / abstract or impersonal presentations

Impact of language factors on content assessments for ELLs
Continuum of Linguistic Complexity and Item Characteristics, Level 5
Items with a maximum level of linguistic complexity:
Highly unfamiliar or seldom used words
Very long or complex sentence(s)
Abstract concept(s)
Very difficult subordinate, conditional, or adverbial clause(s)
Many passive voice / abstract or impersonal presentations
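The five levels above are described qualitatively; the slides give no numeric cut points. Purely as a hypothetical illustration (the thresholds and weights below are assumptions, not part of Abedi's rubric), reviewer-coded feature counts could be folded into a single 1-5 rating along these lines:

```python
def continuum_level(avg_sentence_length: float,
                    unfamiliar_word_ratio: float,
                    passive_or_abstract_hits: int,
                    difficult_clause_hits: int) -> int:
    """Map reviewer-coded feature counts to a 1-5 linguistic-complexity level.

    The cut points below are invented for illustration; they are not taken
    from the slides or the cited studies.
    """
    score = 0
    if avg_sentence_length > 20:
        score += 2
    elif avg_sentence_length > 12:
        score += 1
    if unfamiliar_word_ratio > 0.15:
        score += 2
    elif unfamiliar_word_ratio > 0.05:
        score += 1
    score += min(passive_or_abstract_hits, 2)
    score += min(difficult_clause_hits, 2)
    # Collapse the 0-8 score onto the 1-5 continuum described above.
    return min(5, 1 + score // 2)

# A short, concrete item with no passives or difficult clauses -> level 1;
# a long, abstract item with several of each -> level 5.
print(continuum_level(8, 0.0, 0, 0))   # 1
print(continuum_level(26, 0.2, 3, 2))  # 5
```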

Sample Original and Revised Item
Below is an example of a test item that was deemed to be linguistically complex, along with a linguistically modified version of the item.
Original: If Y represents the number of newspapers that Lee delivers each day, which of the following represents the total number of newspapers that Lee delivers in 5 days?
A) 5 + Y
B) 5 x Y
C) Y + 5
D) (Y + Y) x 5
Modified: Lee delivers Y newspapers each day. How many newspapers does he deliver in 5 days?
(Adapted from Abedi, Lord & Plummer, 1997, p. 21)
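Note that the modification changes the language but not the mathematics being assessed: in both versions the target quantity is the total number of newspapers delivered in 5 days, Y + Y + Y + Y + Y = 5 x Y, which is option B in the original item.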

Linguistic Modifications Made on the Item
Conditional clause changed to a separate sentence
Two relative clauses removed and recast
Long nominals shortened
Question phrase changed from “which of the following represents” to “how many”
Item length changed from 26 to 13 words
Average sentence length changed from 26 to 6.5 words
Number of clauses changed from 4 to 2
Average number of clauses per sentence changed from 4 to 1
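As a rough check of the surface metrics listed above, here is a minimal Python sketch that computes word and sentence counts for the original and modified items. Counting conventions matter: a naive whitespace split will not necessarily reproduce the exact figures reported on the slide (26 and 13 words), which follow the authors' own counting rules.

```python
import re

ORIGINAL = ("If Y represents the number of newspapers that Lee delivers each "
            "day, which of the following represents the total number of "
            "newspapers that Lee delivers in 5 days?")
MODIFIED = ("Lee delivers Y newspapers each day. "
            "How many newspapers does he deliver in 5 days?")

def surface_metrics(text: str) -> dict:
    """Word count, sentence count, and average sentence length for one item."""
    words = text.split()
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    return {
        "words": len(words),
        "sentences": len(sentences),
        "avg_words_per_sentence": round(len(words) / max(len(sentences), 1), 1),
    }

for label, item in (("original", ORIGINAL), ("modified", MODIFIED)):
    print(label, surface_metrics(item))
```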

Conclusions and Recommendations
Assessments and instruction for ELLs:
Must be based on sound psychometric principles
Must be controlled for all sources of nuisance or confounding variables
Must be free of unnecessary linguistic complexities
Must include a sufficient number of ELLs and SWDs in the development process (field testing, standard setting, etc.)
Must be free of biases, such as cultural biases
Must be sensitive to students’ linguistic and cultural needs

Conclusions and Recommendations
Accommodations:
Must be relevant in addressing assessment issues for ELL students
Must be effective in making assessments more accessible to ELL students
Should not alter the construct being measured
Must provide results that can be aggregated with assessment outcomes obtained under standard conditions
Must be feasible in national and state assessments

Conclusions and Recommendations
Examples of research-supported accommodations:
Providing a customized dictionary is a viable alternative to providing traditional dictionaries.
Linguistic modification of test items, which reduces unnecessary linguistic burden on students, is among the accommodations that help ELL students without affecting the validity of assessments.
Computer testing with extra time and a glossary was shown to be a very effective yet valid accommodation (Abedi, Courtney, Leon & Goldberg, 2003).