

C R E S S T / U C L A Impact of Linguistic Factors in Content-Based Assessment for ELL Students Jamal Abedi UCLA Graduate School of Education & Information Studies Center for the Study of Evaluation National Center for Research on Evaluation, Standards, and Student Testing Paper presented at the 2003 Annual Meeting of the American Educational Research Association Chicago April 2003

C R E S S T / U C L A Validity of Academic Achievement Measures
We will focus on construct and content validity:
A test's content validity involves the careful definition of the domain of behaviors to be measured by a test and the logical design of items to cover all the important areas of this domain (Allen & Yen, 1979, p. 96).
A test's construct validity is the degree to which it measures the theoretical construct or trait that it was designed to measure (Allen & Yen, 1979, p. 108).
Thus a content-based achievement test has construct validity if it measures the construct it was designed to measure, and content validity if its items are representative of the content domain being measured.

C R E S S T / U C L A Two major questions on the psychometrics of academic achievement tests for ELLs:
1. Are there any sources of measurement error that may specifically influence ELL performance?
2. Do achievement tests accurately measure ELLs' content knowledge?

C R E S S T / U C L A Linguistic Modification Concerns
• Familiarity/frequency of non-math vocabulary: unfamiliar or infrequent words changed (census → video game)
• Length of nominals: long nominals shortened (last year's class vice president → vice president)
• Question phrases: complex question phrases changed to simple question words (At which of the following times → When)

C R E S S T / U C L A Linguistic Modification (continued)
• Conditional clauses: conditionals either replaced with separate sentences, or the order of the conditional and main clause changed (If Lee delivers x newspapers → Lee delivers x newspapers)
• Relative clauses: relative clauses either removed or recast (A report that contains 64 sheets of paper → He needs 64 sheets of paper for each report)
• Voice of verb phrase: passive verb forms changed to active (The weights of 3 objects were compared → Sandra compared the weights of 3 rabbits)
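The modification categories above amount to a small table of rewrite rules. The sketch below is purely illustrative (the rule table, function name, and regular-expression approach are assumptions, not the procedure used in the CRESST studies); it shows how a few of the slide's own examples could be applied mechanically to an item.

```python
import re

# Hypothetical rule table mirroring the modification examples on the slides:
# each entry maps a complex construction to a linguistically simpler form.
SUBSTITUTIONS = [
    (r"\bAt which of the following times\b", "When"),             # complex question phrase
    (r"\bcensus\b", "video game"),                                # unfamiliar vocabulary
    (r"\blast year's class vice president\b", "vice president"),  # long nominal
]

def simplify_item(text: str) -> str:
    """Apply each substitution rule in order to one test item."""
    for pattern, replacement in SUBSTITUTIONS:
        text = re.sub(pattern, replacement, text)
    return text

item = "At which of the following times did last year's class vice president arrive?"
print(simplify_item(item))  # "When did vice president arrive?"
```

In practice the studies used human judgment rather than pattern matching, since constructions such as relative clauses and passives require restructuring the whole sentence.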

C R E S S T / U C L A CRESST Studies on the Assessment and Accommodation of ELL Students

C R E S S T / U C L A Analyses of extant data (Abedi, Lord, & Plummer, 1995)
Used existing data from the 1992 NAEP assessments in math and science. NAEP test items were grouped into long and short items.
SAMPLE: Approximately 100,000 ELL and non-ELL students in grades 4, 8, and 12.
Findings
• ELL students performed significantly lower on the longer test items.
• ELL students had higher proportions of omitted and/or not-reached items.
• ELL students had higher scores on the less linguistically complex items.
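The item-grouping step described above can be sketched in a few lines: partition items into "long" and "short" by word count, then compare group statistics. The threshold, item IDs, and item texts below are hypothetical, chosen only to illustrate the partitioning.

```python
# Illustrative sketch of grouping test items by length, as in the
# extant-data analyses described above. All data here are made up.

def word_count(item_text: str) -> int:
    """Count whitespace-separated tokens in an item's text."""
    return len(item_text.split())

def group_by_length(items: dict[str, str], threshold: int = 25) -> dict[str, list[str]]:
    """Partition item IDs into 'long' and 'short' by word count."""
    groups = {"long": [], "short": []}
    for item_id, text in items.items():
        key = "long" if word_count(text) >= threshold else "short"
        groups[key].append(item_id)
    return groups

items = {
    "m1": "Compute 3 + 4.",
    "m2": ("If Lee delivers 28 newspapers each day and each paper weighs "
           "250 grams, how many kilograms of newspapers does Lee deliver "
           "in one seven day week in total overall"),
}
print(group_by_length(items))  # {'long': ['m2'], 'short': ['m1']}
```

Once items are grouped, the analysis compares mean scores and omit rates between the two groups for ELL and non-ELL students.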

C R E S S T / U C L A Interview study (Abedi, Lord, & Plummer, 1997)
Math test items were modified to reduce their level of linguistic complexity. 37 students were asked to express their preference between the original NAEP items and the linguistically modified versions of the same items.
Findings
• Over 80% of the students interviewed preferred the linguistically modified items over the original versions.

C R E S S T / U C L A Impact of linguistic factors on students' performance (Abedi, Lord, & Plummer, 1997)
Two studies: testing performance and speed.
SAMPLE: 1,031 grade 8 ELL and non-ELL students in 41 classes from 21 southern California schools.
Findings
• ELL students who received a linguistically modified version of the math test items performed significantly better than those receiving the original items.

C R E S S T / U C L A The impact of different types of accommodations on students with limited English proficiency (Abedi, Lord, & Hofstetter, 1997)
SAMPLE: 1,394 grade 8 students in 56 classes from 27 southern California schools.
Findings
Spanish translation of the NAEP math test
• Spanish speakers taking the Spanish translation performed significantly lower than Spanish speakers taking the English version.
• We believe this reflects the impact of the language of instruction on assessment.
Linguistic modification
• Contributed to improved performance on 49% of the items.
Extra time
• Helped grade 8 ELL students on the NAEP math tests.
• Also aided non-ELL students, so it has limited potential as an assessment accommodation.

C R E S S T / U C L A Impact of selected background variables on students' NAEP math performance (Abedi, Hofstetter, & Lord, 1998)
SAMPLE: 946 grade 8 ELL and non-ELL students in 38 classes from 19 southern California schools.
Four different accommodations were used: linguistically modified items, a glossary only, extra time only, and a glossary plus extra time.
Findings
• The glossary plus extra time was the most effective accommodation.
• However, non-ELLs showed a greater improvement (16%) than ELLs (13%). This is the opposite of what is expected and casts doubt on the validity of this accommodation.
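The validity reasoning above is often called the differential-boost criterion: an accommodation should narrow the ELL/non-ELL gap, so if it raises non-ELL scores as much as or more than ELL scores, it may be altering the construct rather than removing a language barrier. A minimal sketch of that check, using the slide's percentages (the helper function itself is illustrative):

```python
# Differential-boost check. An accommodation is suspect when non-ELL
# students gain as much as (or more than) ELL students from it, since
# that suggests it changes what the test measures rather than removing
# a language barrier. The 13% / 16% figures come from the slide.

def differential_boost(ell_gain: float, non_ell_gain: float) -> float:
    """Return ELL gain minus non-ELL gain, in percentage points.
    A valid accommodation should yield a clearly positive value."""
    return ell_gain - non_ell_gain

boost = differential_boost(ell_gain=13.0, non_ell_gain=16.0)
print(boost)  # -3.0: non-ELLs gained more, so validity is in doubt
```

A full analysis would of course test whether the group-by-accommodation interaction is statistically significant rather than comparing raw gains.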

C R E S S T / U C L A The effects of accommodations on the assessment of LEP students in NAEP (Abedi, Lord, Kim, & Miyoshi, 2000)
SAMPLE: 422 grade 8 ELL and non-ELL students in 17 science classes from 9 southern California schools.
Some forms of accommodation may help recipients with the content of the assessment; for example, a published dictionary defines all the words in a test, both content and non-content.
Findings: A customized dictionary
• was easier to use than a published dictionary
• included only the non-content words in the test
• produced significant improvement in ELL students' performance
• had no impact on non-ELL performance

C R E S S T / U C L A Language accommodation for large-scale assessment in science (Abedi, Courtney, Leon, Mirocha, & Goldberg, 2001)
SAMPLE: 612 grade 4 and grade 8 students in 25 classes from 14 southern California schools.
Findings
• A published dictionary was both ineffective and administratively difficult to implement as an accommodation.

C R E S S T / U C L A Language accommodation for large-scale assessment in science (Abedi, Courtney, & Leon, 2001)
SAMPLE: 1,856 grade 4 and 1,512 grade 8 ELL and non-ELL students in 132 classes from 40 school sites in four cities across three states.
Findings
• Linguistic modification of the test items improved the performance of grade 8 ELLs.
• The modified test did not change the performance of non-ELLs.
• Thus the validity of the assessment was not compromised by providing the accommodation.

C R E S S T / U C L A Impact of students' language background on content-based performance: analyses of extant data (Abedi & Leon, 1999)
Analyses were performed on extant data, such as the Stanford 9 and the ITBS.
SAMPLE: Over 900,000 students from four different sites nationwide.
Examining ELL and non-ELL student performance differences and their relationship to background factors (Abedi, Leon, & Mirocha, 2001)
Data were analyzed for the impact of language on the assessment and accommodation of ELL students.
SAMPLE: Over 700,000 students from four different sites nationwide.
Findings
• The higher the language demand of the test items, the wider the performance gap between ELL and non-ELL students.
• There was a large performance gap between ELL and non-ELL students in reading, science, and math problem solving (about 15 NCE score points).
• This performance gap was reduced to zero in math computation.
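The roughly 15-point gaps above are on the Normal Curve Equivalent (NCE) scale, a normalized transform of percentile rank with mean 50 and standard deviation about 21.06, chosen so that NCE scores of 1, 50, and 99 coincide with percentile ranks of 1, 50, and 99. A minimal sketch of the standard conversion (not code from the studies):

```python
from statistics import NormalDist

# Normal Curve Equivalent (NCE): percentile rank mapped through the
# inverse normal CDF, then rescaled to mean 50, SD ~21.06.

def percentile_to_nce(percentile: float) -> float:
    """Convert a percentile rank (strictly between 0 and 100) to an NCE score."""
    z = NormalDist().inv_cdf(percentile / 100.0)
    return 50.0 + 21.06 * z

print(round(percentile_to_nce(50), 1))  # 50.0
print(round(percentile_to_nce(99), 1))  # 99.0
```

Because NCE is an equal-interval scale, unlike raw percentile ranks, group differences such as the 15-point gap can be averaged and compared across tests.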

C R E S S T / U C L A Normal Curve Equivalent Means and Standard Deviations for Students in Grades 10 and 11, Site 3 School District
[Table: NCE means (M) and standard deviations (SD) in Reading, Science, and Math for Grade 10 and Grade 11 students, broken out by group: SD only, LEP only, LEP & SD, non-LEP & SD, and all students. The numeric values are not recoverable from the transcript.]
Note. LEP = limited English proficient. SD = students with disabilities.

C R E S S T / U C L A The Disparity Index (DI) is an index of performance differences between LEP and non-LEP students.
[Table: Site 3 Disparity Index values comparing non-LEP/non-SD students to LEP-only students, by grade, for Reading and for Math (Total, Calculation, Analytical). The numeric values are not recoverable from the transcript.]
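The slide defines the DI only as an index of LEP/non-LEP performance differences. One plausible formulation, shown below as an assumption rather than CRESST's exact formula, expresses the non-LEP group's mean advantage as a percentage of the LEP group's mean:

```python
# Disparity Index (DI) sketch. The exact formula used in the CRESST
# analyses is not given on the slide; this version (an assumption)
# reports the non-LEP mean's advantage as a percentage of the LEP mean.

def disparity_index(non_lep_mean: float, lep_mean: float) -> float:
    """Percent by which the non-LEP mean exceeds the LEP mean."""
    return (non_lep_mean - lep_mean) / lep_mean * 100.0

# Hypothetical NCE means, for illustration only.
print(round(disparity_index(non_lep_mean=55.0, lep_mean=40.0), 1))  # 37.5
```

Whatever the precise definition, a ratio-style index lets gaps be compared across subjects whose score distributions differ, which is how the Site 3 table contrasts reading with math calculation and analytical scores.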
