Greg Lobdell, President & Director of Research. In collaboration with the State Board of Education: Ben Rarick, Andrew Parr, and Linda Drake.

Understanding Former-ELL
Work began in fall 2013, as Achievement Index (AI) development was underway.
This led to conversations with SBE and OSPI staff about what I was seeing in the data.
Current project: a research report / article highlighting what we see, and the policy implications of deeply understanding the Former-ELL subgroup.

Collaboration
State Board of Education:
– Ben Rarick
– Andrew Parr
– Linda Drake
CEE: Greg Lobdell

Objectives of the Work
The journey is just beginning…
What can we learn about the Former-ELL subgroup?
– Size and characteristics
– Performance on the three indicators used in the Achievement Index (AI)
What factors matter? Size, poverty, diversity, language, grade, grade configuration, etc.
Positive outliers: what do we see in the schools that are significant positive outliers?
– Do we have other data we can look at for these schools?
– Upward feeder-pattern analysis
Policy implications

What can the AI data tell us?
The building is the unit of analysis.
Continuously enrolled students only.
Performance indicators:
– Proficiency: percent of students meeting standard in reading, math, writing, and science
– Median Student Growth Percentiles (MSGP): reading and math
– Graduation rate: 5-year Adjusted Cohort method
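
The Median Student Growth Percentile indicator is straightforward to compute once each student's SGP is known: take the median across a school's continuously enrolled students. A minimal sketch in Python; the SGP values are invented for illustration, not real data:

```python
from statistics import median

# Hypothetical student growth percentiles (1-99) for one school's
# continuously enrolled students in reading; values are illustrative.
reading_sgps = [34, 55, 61, 72, 48, 90, 12, 66, 58]

# The school-level indicator is simply the median of the student SGPs.
school_msgp = median(reading_sgps)  # → 58
```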

Sizing the Groups: 3-Year Average Size

Group          Gr 3    Gr 4    Gr 5    Gr 6    Gr 7    Gr 8    Gr 10
Former-ELL     5,497   7,069   7,559   7,832   7,730   7,646   6,472
ELL            8,578   6,947   6,030   5,026   3,978   3,477   2,875
All Students  61,116  61,481  62,170  62,642  62,654  62,298  59,693

Source: OSPI MSP and HSPE Reading raw data for the Achievement Index. Continuously enrolled students only. Testing years 2011, 2012, and 2013.
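
Each cell in the table is a mean across the three testing years. A sketch of that arithmetic; the per-year counts below are invented so that they average to the table's grade 3 Former-ELL figure (5,497) — only that average comes from the slide:

```python
# Hypothetical per-year Former-ELL grade 3 counts. Only their 3-year
# average (5,497, matching the table) is real; the yearly split is invented.
counts_by_year = {2011: 5301, 2012: 5512, 2013: 5678}

three_year_avg = sum(counts_by_year.values()) / len(counts_by_year)  # → 5497.0
```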

Your Turn: What do you see?
Snapshot of performance indicators:
– Grade-by-grade proficiency in all content areas for All Students, ELL, and Former-ELL
– Student growth: mean of the 3-year Median Growth Percentiles
– Graduation rate: three years of 5-year Adjusted Cohort graduation rates
What observations do you have?
What other questions would help you understand this more deeply?

Group Differences – Q1
Are schools with reportable Former-ELL populations different from schools without them with respect to student demographics? If so, is the pattern of differences consistent across grade levels?

School Differences t-Test Results
Reported per measure: Former-ELL group, mean, standard deviation, standard error of the mean, and t-test result (the numeric values did not survive transcription).
– PCT_MIGRANT: p <
– PCT_ELL: p <
– PCT_SWD: p <
– PCT_FRL: p <
– PCT_: p =
– PCT_FOSTER: p <
– TOTAL_N: p <
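
A sketch of the kind of two-group comparison behind these results, using Welch's unequal-variance t statistic. The group values below are invented for illustration; the slides do not specify the study's samples or software:

```python
from math import sqrt
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch's t statistic: the difference in group means divided by the
    standard error of that difference, allowing unequal variances and sizes."""
    na, nb = len(sample_a), len(sample_b)
    se = sqrt(variance(sample_a) / na + variance(sample_b) / nb)
    return (mean(sample_a) - mean(sample_b)) / se

# Hypothetical percent-FRL values for schools with and without a
# reportable Former-ELL subgroup (illustrative, not the study's data).
fell_schools = [88.0, 72.5, 65.0, 90.1, 78.3]
non_fell_schools = [45.2, 50.0, 38.7, 55.4, 42.9]

t_stat = welch_t(fell_schools, non_fell_schools)
```

A large positive t here would point the same direction as the slide's finding: schools serving a reportable Former-ELL group tend to differ demographically from those that do not.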

Group Differences – Q2
Is the academic performance of the All Students group different at Former-ELL schools compared to non-Former-ELL schools? If so, is the pattern of differences consistent across grade levels?

School Academics t-Test Results
Reported per measure: Former-ELL group, N, mean, standard deviation, standard error of the mean, and t-test result (the numeric values did not survive transcription).
– Reading_3-Yr_Pct_Met: p <
– Math_3-Yr_Percent_Met: p =
– RandM_3YR_AVG_PRO: p <
– Reading_3Yr_MSGP: p =
– Math_3Yr_MSGP: p =
– RandM_3YR_AVG_MGP: p =

Group Differences – Q3
For schools with reportable Former-ELL populations, how do the academic measures for the Former-ELL students compare to those for the All Students group? How do the measures vary by content area and by school level?

Group Differences Descriptive Statistics
Reported per measure: N, minimum, maximum, mean, and standard deviation (the numeric values did not survive transcription).
Former-ELL measures:
– FELL_R_PRO_3YR_AVG
– FELL_M_PRO_3YR_AVG
– FELL_RandM_PRO_3YR_AVG
– FELL_R_MGP_3YR_AVG
– FELL_M_MGP_3YR_AVG
– FELL_RandM_GRO_3YR_AVG
All Students measures:
– Reading_3Yr_Pct_Met
– Math_3Yr_Percent_Met
– RandM_3YR_AVG_PRO
– Reading_3Yr_MSGP
– Math_3Yr_MSGP
– RandM_3YR_AVG_MGP
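
The columns in this descriptive table (N, minimum, maximum, mean, standard deviation) can be reproduced per measure with the standard library. A sketch with invented proficiency rates, not the study's data:

```python
from statistics import mean, stdev

def describe(values):
    """N, min, max, mean, and sample standard deviation for one school measure,
    mirroring the columns of the descriptive-statistics table."""
    return {
        "N": len(values),
        "Minimum": min(values),
        "Maximum": max(values),
        "Mean": round(mean(values), 1),
        "Std. Deviation": round(stdev(values), 1),
    }

# Hypothetical 3-year-average Former-ELL reading proficiency rates
# (percent meeting standard) for a handful of schools; illustrative only.
fell_reading_pro = [62.0, 71.5, 55.0, 80.2, 68.3]
summary = describe(fell_reading_pro)
```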

Group Differences – Q4 For each of the academic performance indicators and school level, which schools have the greatest demonstrable success with their respective Former ELL students?

Highest Performing Schools
Reported for each measure and school level (ES, MS, HS): N and the 90th- and 95th-percentile cut scores (the numeric values did not survive transcription).
– FELL_R_PRO_3YR_AVG
– FELL_M_PRO_3YR_AVG
– FELL_RandM_PRO_3YR_AVG
– FELL_R_MGP_3YR_AVG
– FELL_M_MGP_3YR_AVG
– FELL_RandM_GRO_3YR_AVG
– FELL_GRAD_3YR_AVG
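
Flagging 90th- and 95th-percentile schools amounts to computing a percentile cutoff per measure and keeping the schools at or above it. A sketch using a nearest-rank percentile definition (the slides do not say which percentile method was used, and the school names and scores below are invented):

```python
from math import ceil

def percentile_cutoff(values, pct):
    """Nearest-rank percentile: the smallest value such that at least
    pct percent of the data lies at or below it."""
    ordered = sorted(values)
    rank = max(1, ceil(pct / 100 * len(ordered)))
    return ordered[rank - 1]

# Hypothetical school-level Former-ELL measure, keyed by school name.
scores = {f"School {i}": float(i) for i in range(1, 101)}

cutoff_90 = percentile_cutoff(list(scores.values()), 90)
top_schools = sorted(name for name, v in scores.items() if v >= cutoff_90)
```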

Positive Outliers: Further Investigation Is Necessary to Understand
One example, a K-5 elementary school:
– 520 students
– 88% free/reduced-price lunch
– 40% ELL
– 11% Asian/Pacific Islander
– 5% Black/African American
– 51% Hispanic
– 25% White

Guided Conversation…
Would it be more appropriate to compare Former-ELL performance to a "never-ELL" group or to the all-students group?
What additional analyses would be useful?
– How should this study be expanded or refined?

If you think of things later…