Standards, Structure, & Field Testing of ACCESS for ELLs™


Standards, Structure, & Field Testing of ACCESS for ELLs™
Jessica Motz, Center for Applied Linguistics, Washington, DC
January 2005
Illinois 28th Annual Statewide Conference for Teachers Serving Linguistically and Culturally Diverse Students

Outline
- Background on the WIDA Project
- WIDA Standards for English Language Proficiency – language vs. content
- Structure of the ACCESS for ELLs™ Test
- Sample Items
- Pilot Test Results
- Field Testing Update
- Training for ACCESS for ELLs™
- Other Issues/Questions/Discussion

Origins of the WIDA Consortium
- 2002: U.S. Dept. of Education Enhancement Grant Competition
- 3 original states: Wisconsin, Delaware, Arkansas
- Early additions: District of Columbia, Rhode Island, Maine, New Hampshire, Vermont
- Later additions: Illinois, Alabama

The federal grant competition helps states meet the requirements of Title III of NCLB, which requires that ELLs be assessed annually for growth in English proficiency in Listening, Speaking, Reading, and Writing. The acronym is changing to World-class Instruction & Development of Assessments for ELLs. The consortium now includes ten states, representing approximately 275,000 ELLs.

Multiple WIDA Components
- English Language Proficiency Standards
- Large-Scale Assessment (ACCESS for ELLs)
- Classroom Instruction & Assessment
- Alternate Assessment (Alternate ACCESS)
- Professional Development
- Validation and Research

The components span language proficiency and academic content, linking standards, assessment, and instruction. The ELP Classroom Standards are more performance-based (portfolios, etc.). Pending funding, Alternate ACCESS may be developed as an alternate test of academic content for ELLs at language proficiency levels 1 and 2.

WIDA Partners
- State Leadership: Tim Boals, Wisconsin DPI, Chair of the Steering Committee
- Lead Developer (Standards and Project PI): Margo Gottlieb, Illinois Resource Center
- Item Specification Development: Fred Davidson, UIUC
- Test Development: Language Testing Division, CAL
- Professional Development: Lorraine Valdez-Pierce, George Mason University
- Technical Applications (Database, Desire2Learn): University of Wisconsin - Oshkosh

Centrality of the ELP Standards
[Diagram: the English Language Proficiency Standards & Performance Definitions sit at the center, feeding both the Classroom Assessment Framework (with Classroom Model Performance Indicators) and the Large-Scale Assessment Framework (with Large-Scale Model Performance Indicators).]

Refer to the ‘red book.’ The standards address language proficiency, not content knowledge: the language of math, for example, not the math/computation itself.

Overall Organization of Standards
- Frameworks for Classroom & Large-Scale Assessment (2)
- English Language Proficiency Standards (5)
- Language Domains (4)
- Grade Level Clusters (4)
- Language Proficiency Levels (5)
- Model Performance Indicators

Model PIs are the lowest level of expression of the standards. Between the framework level (see previous slide) and the PI level, there are several levels of organization. The organization of the standards is hierarchical: at the highest level, the standards are general statements about a broad range of communicative proficiency in a particular content area; as they drill down to the Model Performance Indicators, they become much more specific about the particular kind of language proficiency being addressed. Across the two frameworks, there are over 800 PIs.
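As a rough check on that figure (a sketch only, assuming one model PI per combination of the levels listed above):

$$2 \text{ frameworks} \times 5 \text{ standards} \times 4 \text{ domains} \times 4 \text{ grade clusters} \times 5 \text{ proficiency levels} = 800$$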

The WIDA ELP Standards
- Standard 1 (SI): English language learners communicate in English for social and instructional purposes in the school setting.
- Standard 2 (LA): English language learners communicate information, ideas and concepts necessary for academic success in the content area of Language Arts.
- Standard 3 (MA): English language learners communicate information, ideas and concepts necessary for academic success in the content area of Math.
- Standard 4 (SC): English language learners communicate information, ideas and concepts necessary for academic success in the content area of Science.
- Standard 5 (SS): English language learners communicate information, ideas and concepts necessary for academic success in the content area of Social Studies.

Within each standard, there are PIs for Listening, Speaking, Reading, and Writing for each grade-level cluster (K-2, 3-5, 6-8, 9-12).

The Levels of English Language Proficiency
- 5 BRIDGING
- 4 EXPANDING
- 3 DEVELOPING
- 2 BEGINNING
- 1 ENTERING
Additional designations: 6 Formerly LEP, 7 Never LEP.

The five proficiency levels derive from Wisconsin’s scale and definitions. The labels used here were created by the WIDA development team.

Criteria for Proficiency Level Definitions (applied across levels 1 Entering through 5 Bridging)
- Comprehension and use of the technical language of the content areas
- Extent of language (text or discourse) control
- Development of phonological, syntactic, and semantic understanding or usage

The criteria used to determine the proficiency level definitions are couched in terms of the language used in schools to impart content-area information. Issues of linguistic complexity and semantic and pragmatic knowledge are brought to bear in formulating the definitions. At the two lower proficiency levels, it is assumed that ELLs need extralinguistic support via graphic and visual aids in order to carry out language functions; this requirement also motivates the use of graphics for test items at these levels. The rubrics for Writing and Speaking are based on these three criteria.

Large-Scale Standards: SC Reading
Example model PIs highlighted in the Science Reading strand:
- “Classify living organisms (such as birds and mammals) by using pictures or icons”
- “Interpret data presented in text and tables in scientific studies”

Assessment Forms
- Non-secure form for initial screening (July 1)
  - One for each grade-level cluster, with items at all 5 proficiency levels
  - Kindergarten form: individually administered
- Secure forms for annual testing
  - Two (initially: 100 and 200) for each grade-level cluster
  - Tier A: proficiency levels 1-3
  - Tier B: proficiency levels 2-4
  - Tier C: proficiency levels 3-5

The range of proficiency levels indicated on each form (tier) represents the focal levels. It is possible, however, that for Tier B a few items designated at levels 1 and 5 may also be included.

Tier Alignment with Proficiency Levels
[Figure: the three annual ACCESS for ELLs tiers spanning the proficiency scale from 1 Entering to 5 Bridging — Tier A covers levels 1-3, Tier B levels 2-4, Tier C levels 3-5.]

The three tiers of the ACCESS for ELLs test will be calibrated to best serve ELLs at the boundaries indicated in the figure. The majority of students are expected to receive the Tier B form; Tier A is intended for students at very low proficiency, and Tier C for students close to exiting ELL status.
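The tier-to-level mapping above can be read as a simple lookup. The sketch below is illustrative only; the helper name and the selection logic are hypothetical and do not represent WIDA's operational placement procedure:

```python
# Illustrative sketch only -- not WIDA's operational placement logic.
# The tier-to-proficiency-level mapping from this slide, expressed as a lookup,
# with a hypothetical helper that lists the tiers covering a given level.
TIER_LEVELS = {
    "A": range(1, 4),  # focal proficiency levels 1-3
    "B": range(2, 5),  # focal proficiency levels 2-4
    "C": range(3, 6),  # focal proficiency levels 3-5
}

def candidate_tiers(proficiency_level: int) -> list[str]:
    """Return the tiers whose focal range includes the given proficiency level."""
    return [tier for tier, levels in TIER_LEVELS.items() if proficiency_level in levels]

print(candidate_tiers(3))  # ['A', 'B', 'C'] -- a level-3 student falls within all three focal ranges
```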

WIDA Steering Committee Administration Decisions
- Listening (15%): 20-25 minutes, machine scored
- Reading (35%): 35-40 minutes, machine scored
- Writing (35%): up to 1 hour, rater scored
- Speaking (15%): up to 15 minutes, administrator scored

The administration times do not directly reflect the test component weights used to calculate the composite score.
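Read as weights on the four domain results (an interpretation consistent with the worked composite example later in this deck, not a formula stated on the slide), the composite would be:

$$\mathrm{Composite} = 0.35\,\mathrm{Writing} + 0.35\,\mathrm{Reading} + 0.15\,\mathrm{Listening} + 0.15\,\mathrm{Speaking}$$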

Structure of ACCESS for ELLs
Grade level and tier: Kindergarten has a single adaptive form; clusters 1-2, 3-5, 6-8, and 9-12 each have Tier A, B, and C forms.
Domains:
- Listening — group administration, machine scored
- Reading — group administration, machine scored
- Speaking — individual administration, adaptive, test administrator (TA) scored
- Writing — group administration, rater scored
Forms:
- 100 (roll-out Spring 2005)
- 999 (used to produce screener)
- 200 (roll-out Spring 2006)

Item Creation Process (Fall 2003 – present)
- Item specifications drafted
- Item writers assembled from nominated ESL teachers in consortium states
- Item writers trained using Blackboard distance-learning management software
- Item writers submitted items electronically
- Items reviewed by external reviewers, also trained via Blackboard
- Items reviewed and revised internally and organized into themes…

A primary concern in writing the items was to get representation from each of the states so that the particulars of their academic content standards and approaches to curriculum would be reflected. The distance-learning vehicle provided the content delivery and communication tools necessary to effectively simulate face-to-face training, and also provided a means for sustaining and reinforcing training during the entire two-month period of active item writing.

Item Creation Process, continued… (Fall 2003 – present)
- Thematic folders of items arranged onto pilot test forms
- Forms piloted in 5 WIDA districts
- Pilot analysis and feedback incorporated; items created and revised by ESL teachers
- Thematic folders of items arranged onto field test forms
- Forms field tested in 8 WIDA states
- Field test analysis currently underway

For more information on the item development process, see Jim Bauman’s session.

Pilot Testing Results

Pilot Test Participation
- April – May 2004
- 5 districts: Kenosha & Milwaukee, WI; Chicago & Cicero, IL; Washington, DC
- Approx. 1,100 students in grades 1-12

Listening Test: multiple choice, 20-25 minutes

Sample Items: Listening, Science 1-2

Science Listening 1-2, P1
PI: “identify objects according to chemical or physical properties from pictures and oral statements”
SCRIPT: “A seed is small. Find the small seed.”

Science Listening 1-2, P2
PI: “match objects with their chemical or physical properties from pictures and oral statements”
SCRIPT: “One day the seed will grow into something large, round, and heavy. Find what the seed grows into.”

Science Listening 1-2, P3
PI: “identify and group objects according to chemical or physical properties from oral statements”
SCRIPT: “Seeds grow into plants. Find something else that grows.”

Percent Correct on Each Item (Grades 1-2, n = 173)
[Figure: mean percent correct by item — P1 item: 0.94, P2 item: 0.89, P3 item: 0.79.]

Percent Correct on Each Item (by Tier)

Percent Correct on Each Item (by Grade Level)
From these data, item difficulties for the pilot test items were determined. Similar data on the field test items will be used to determine cut scores.
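For readers unfamiliar with the statistic behind these three slides, here is a minimal sketch of computing percent correct (item facility) from scored responses; the data are invented for illustration and are not the pilot results:

```python
# Minimal sketch: percent correct (item facility) per item from a 0/1 scored
# response matrix. The data below are invented, not the ACCESS pilot results.
import numpy as np

# rows = students, columns = items (e.g., the P1, P2, P3 listening items)
scored = np.array([
    [1, 1, 1],
    [1, 1, 0],
    [1, 0, 1],
    [0, 1, 0],
])

percent_correct = scored.mean(axis=0)  # mean of each column
for item, p in zip(["P1", "P2", "P3"], percent_correct):
    print(f"{item}: {p:.2f}")
```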

Average Listening Item Difficulty Across Grade Clusters (by Proficiency Level)
[Figure: mean item difficulty (scale roughly 200-1000) plotted by proficiency level (1-5) for each grade-level cluster (1-2, 3-5, 6-8, 9-12).]

We examined average listening item difficulty across grade-level clusters. Grades 6-8 items appeared to be more difficult overall at level 3, for example, than some grades 9-12 items at level 3.

Average Reading Item Difficulty by Grade Level Cluster across P-Levels

Reading Test: multiple choice, 35-40 minutes

Examining Reading Items Across a Strand for the Different Tiers
Language Arts, Reading, Grades 3-5

Sample Item (Field Test): Reading, Lang. Arts, Grades 3-5, Tier C

Items Tied to Performance Indicators: Tier C Items
- #11: “To Jessica, he was the best dog in the world.” This sentence shows Jessica’s opinion. Which of the following also shows an opinion?
  PI (p3): Identify language associated with stating opinions found in fiction or non-fiction text.
- #12: When Jessica saw the sign for the lost dog, why did she believe it was Blue?
  PI (p4): Differentiate between statements of fact and opinion found in various reading selections.
- #13: Why does the woman say at the end of the story, “He’s your dog, all right!”?
  PI (p5): Identify the author’s reasons or intent for selecting facts or opinions found in fiction or non-fiction from grade-level language arts text.

Reading Items Adapted for Tier A
- Simpler text and more graphic support
- Items at proficiency levels 1, 2, and 3

For example:
1. Which is Blue?
   PI (p1): Match labels or identify facts from pictures and phrases.
2. “I know he has white spots.” Which words in this sentence tell you it is a fact?
   PI (p2): Identify language associated with stating facts found in short fiction or non-fiction text supported by pictures or graphics.
3. [Same item as Tier C] “To Jessica, he was the best dog in the world.” This sentence shows Jessica’s opinion. Which of the following also shows an opinion?
   PI (p3): Identify language associated with stating opinions found in fiction or non-fiction text.

Tier Alignment with Proficiency Levels on Test Forms (Listening, Reading, Writing)
[Figure repeated from earlier: Tier A covers levels 1-3, Tier B levels 2-4, Tier C levels 3-5 across the proficiency scale from 1 Entering to 5 Bridging.]

Writing Test
- Up to 1 hour
- 4 tasks per tiered form: SI, MA, SC, LA/SS
- A model is provided to give background and structure for the task

Sample Item: Writing, Grades 6-8
Language Arts Writing 6-8, p5 PI: “defend positions or stances using original ideas with supporting details”
SSW (Social Studies Writing) 6-8, p5 PI: “discuss which functions of the U.S. or other governments are most effective and why”


Speaking Test: Adaptive Format

Sample Item: Speaking

Task (Proficiency Level 1) Example: Grades 3-5
“First let’s talk about things people do outside. This is a picture of people in a park. I’m going to ask you some questions about this picture.”
Q1: (Point to TREE) What is this?
Q2: (Point to BALL) What is this?
Q3: (Point to DOG) What is this?
Q4: (If necessary) What else do you see in this picture? (OR) What other things do you see in this picture?
SI Speaking 3-5, P1: “Respond to WH- questions”

Task (Proficiency Level 2) Example:
“Now listen carefully. I’ve just asked you some questions about this picture. Now I want you to ask me some questions about it. (OR) Pretend you are the teacher and want to ask me some questions about this picture. For example, you could ask me, ‘Where are the people?’ OK?”
Q1: (Point to BOY ON BIKE) What do you want to know about him? (OR) Ask me a question about him.
Q2: (Point to PICNIC TABLE) What do you want to know about this? (OR) Ask me a question about this.
Q3: What other things do you want to know about this picture? (OR) What’s another question you can ask me about (anything in) this picture? (Answer the student’s question.)
SI Speaking 3-5, P2: “Ask and respond to questions”

Task (Proficiency Level 3) Example:
“Now let me tell you something about these children. (Point to CHILDREN PLAYING CATCH) Their names are Alex and Leticia. They like to play catch.”
Q1: Do you like to play catch?
Q2: (If “Yes”) What else do you like to do?
Q3: (If “No”) What do you like to do?
Q4: What do you like about __________? (OR) Tell me something about ___________.
Q5: (If necessary) Tell me more.
SI Speaking 3-5, P3: “Exchange personal information”

Field Testing: Overview

Field Test Participation
- Request: group-administered sections, 600 students per form (minimum 400 per form); individually administered sections, 225 per form
- Includes 8 WIDA states
- Proportional representation of states (approx. 5.5%)
- Approx. 8,700 students total (~3,500 from Illinois)

Initial Field Test Analyses
- Concurrent calibration of multiple-choice items (Rasch)
- Separate scale construction for each domain
- Raw score to scale score conversion on the screener

NOTE: The Steering Committee determined the following weights for use when a single level designation is needed: Writing 35%, Reading 35%, Listening 15%, Speaking 15%. For example, with Writing at level 2, Reading at level 2, Listening at level 5, and Speaking at level 4, the composite is 2.75 (not yet level 3).
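The composite arithmetic on this slide, together with the basic Rasch model named for calibration, can be sketched as follows (illustrative only; this is the textbook Rasch formula, not CAL's operational analysis code):

```python
# Illustrative sketch only -- not the operational WIDA/CAL analysis.
import math

# Weighted composite of domain level designations, using the weights above.
WEIGHTS = {"writing": 0.35, "reading": 0.35, "listening": 0.15, "speaking": 0.15}

def composite(levels: dict) -> float:
    return sum(WEIGHTS[domain] * levels[domain] for domain in WEIGHTS)

example = {"writing": 2, "reading": 2, "listening": 5, "speaking": 4}
print(round(composite(example), 2))  # 2.75 -- "not yet 3"

# Basic Rasch model: probability of a correct response given person ability
# (theta) and item difficulty (b), both expressed on the same logit scale.
def rasch_p_correct(theta: float, b: float) -> float:
    return 1.0 / (1.0 + math.exp(-(theta - b)))

print(round(rasch_p_correct(theta=0.5, b=-0.5), 2))  # 0.73
```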

Sample Feedback Received to Date
Input from test administrators and coordinators:
- Test length: the test is taking longer to administer than anticipated.
- Test difficulty: the test is more difficult than anticipated.
- Grades 1-2 test: too challenging for fall first graders; these students should take the K test.
- Grades 9-12 test: there is not enough authentic literature on the Reading Test.

CAL and DPI responded to these concerns via e-mail and the D2L (online training) discussion board.

Timeline for Development and Implementation
- Field testing (incl. English speakers): September 2004 - January 2005
- Analysis of field test data: November 2004 - January 2005
- Setting cut scores: January 2005
- Spring roll-out of Form 100 (AL, ME, VT): February - May 2005
- Practice items for fall roll-out available: March 15, 2005
- Screener available: July 2005
- Fall roll-out of Form 100 (IL, DC): Fall 2005

Training for ACCESS for ELLs

Training for ACCESS
- August 2004: SEA WebEx conference on logistics and data needed for the field test
- September 2004: LEA coordinators trained online via Desire2Learn at UW – Oshkosh
- September – November 2004: field test administrators trained online via D2L
- December 2004 – January 2005: Speaking field test administrators trained in person (DC, WI)
- March – May 2005: operational test administrators giving the spring roll-out test trained

The format of the operational training is still being worked out.

WIDA Field Test Administrator Workshop
For field test training (Listening, Reading, and Writing portions):
- 3-hour training
- Secure online site
- Readings, sound files (Listening test), discussion boards
- Assessment for certification as a test administrator (TA)

Operational ACCESS Test Administrator Training: Possible Scenarios
Beginning March 2005:
- 2-hour total online training for all sections of the test
- Additional, as-needed training on scoring the Speaking test, with online or live (state/district) trained facilitators

For more information and updates: www.wida.us

Other Issues/Questions/Discussion