Van Hise Elementary School Review of Data School Improvement Process March 3, 2009

Why use data? How should we use it? Data isn't meant to replace our knowledge, experience, insights, and intuitions. Data complements each of these, helping us avoid "blind spots" or generalizations that call for a more sophisticated understanding. Data is best used as a source of information that leads to reflection. Numbers are numbers; their meanings are determined through reflective analysis and thoughtful discussion.

How will we respond to the data we review today? As we approach each data source, consider your state of mind: What assumptions do you bring to the data? What predictions are you making? After reviewing each set of data, ask yourself:
– What important points seem to "pop out"?
– What patterns and trends emerge?
– What seems surprising or unexpected?
Then consider the information that is missing: What other information should be gathered? In what directions do we need to examine the data in greater detail, or from another perspective?

…and remember: As we examine the data, two tendencies sometimes occur:
1) Focusing only on the negative, or on the needs that are apparent, and ignoring the strengths and positive "assets" in the school.
2) Being offended or getting defensive about data that points out needs, challenges, or concerns.

How has the overall enrollment changed across time? This year marks the highest enrollment level (the previous high was 329, in 1999).

What do we know about our students? [Chart: Enrollment by low income at VHES (20%)]

How does VHES’s level of economically disadvantaged students compare to the District and State?

This Year's Enrollment by Low Income: MMSD, September 2008

Data on our students… [Chart: Racial/ethnic diversity at VHES (63%, 22%, 9%, 6%)]

How does VHES's diversity of students compare to the District and State? [Chart: comparison by school year]

How did VHES’s student needs compare to the District?

Other data you may want to look at later:
– Mobility rates (whole school and disaggregated by student groups)
– Home factors: number of parents in household and highest education level

What do we know about how our students are engaged? [Chart: Attendance rates for all students, VHES and MMSD elementary schools] All student groups were above the 94% goal last year.

Another indicator of engagement is behavior-related data: both suspension data and office referral data.

As we begin looking at measures of learning, we will start with the SAGE Report data. Some objectives met the 80% standard; 7 objectives were below 80%.

How have our students performed on the PLAA over time? How does this compare to the District average?

How have our students performed on the PMA over time? How does this compare to the District average?

Adequate Yearly Progress: Annual Measurable Objectives (% Proficient/Advanced)

                                              Reading    Math
Starting Point (held 3 years)                 61%        37%
Intermediate Goal (begin new 3-8 tests;
    held 3 years)                             67.5%      47.5%
Intermediate Goal (held 3 years)              74%        58%
Intermediate Goal                             80.5%      68.5%
Intermediate Goal                             87%        79%
Intermediate Goal                             93.5%      89.5%
Goal: All Proficient (by 2013-14)             100%       100%
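The objectives rise in equal increments from each subject's starting point to 100% by the NCLB deadline; the 74% reading and 58% math criteria cited on the slides below are the third step of that schedule. A minimal sketch (Python, not part of the original slides) that reproduces the schedule under that equal-increment assumption:

    # Equal-increment AMO schedule: six steps from the starting point to 100%.
    # Assumption: the increments are uniform, which matches the table above.
    def amo_schedule(start, steps=6):
        increment = (100.0 - start) / steps
        return [round(start + i * increment, 1) for i in range(steps + 1)]

    print(amo_schedule(61))  # Reading: [61.0, 67.5, 74.0, 80.5, 87.0, 93.5, 100.0]
    print(amo_schedule(37))  # Math:    [37.0, 47.5, 58.0, 68.5, 79.0, 89.5, 100.0]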

When we consider the "high stakes" test for reading, how did our students perform? How do our Proficiency/Advanced levels compare to the District and State? WKCE – Reading, 2007: Proficiency/Advanced %. The criterion that determines a school's status (AYP): Reading – 74%. [Chart: results for grades 3, 4, and 5]

When we consider the "high stakes" test for reading, how did our students perform? How do our Proficiency/Advanced levels compare to peer schools? [Chart: WKCE – Reading, 2007, Proficiency/Advanced %]

When we compare the students that we could instructionally impact (FAY) to Wisconsin schools with similar levels of economic disadvantage, how did we do in bringing our students up to proficiency in reading?

When we compare the students that we could instructionally impact (FAY) to Dane County schools with similar levels of economic disadvantage, how did we do in bringing our students up to proficiency in reading?

Looking at the Six-Trait Writing Sample results, how have our third graders performed over time?

Looking at the Six-Trait Writing Sample results, how did our third graders compare to the district average?

Looking at the Six-Trait Writing Sample results, how have our fifth graders performed over time?

Looking at the Six-Trait Writing Sample results, how did our fifth graders compare to the district average?

When we consider the "high stakes" test for mathematics, how did our students perform? How do our Proficiency/Advanced levels compare to the District and State? WKCE – Mathematics, 2007: Proficiency/Advanced %. The criterion that determines a school's status (AYP): Math – 58%.

When we consider the "high stakes" test for mathematics, how did our students perform? How do our Proficiency/Advanced levels compare to peer schools? [Chart: WKCE – Math, 2007, Proficiency/Advanced %]

When we compare the students that we could instructionally impact (FAY) to Wisconsin schools with similar levels of economic disadvantage, how did we do in bringing our students up to proficiency in math?

When we compare the students that we could instructionally impact (FAY) to Dane County schools with similar levels of economic disadvantage, how did we do in bringing our students up to proficiency in math?

Improvement-based school performance measures

Value added measures
– Extra WKCE points gained by students at a school, on average, relative to observably similar students across the district.
– A value added of +3 means students gained 3 points more than the district average.
– A value added of -3 means students gained 3 points less than the district average.
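A concrete illustration of the definition (hypothetical numbers, not MMSD data):

    # Value added as defined above: the school's average WKCE gain minus the
    # average gain of observably similar students across the district.
    school_gains = [12, 15, 9, 14]   # hypothetical per-student WKCE point gains
    district_mean_gain = 10.5        # hypothetical district-wide average gain

    value_added = sum(school_gains) / len(school_gains) - district_mean_gain
    print(value_added)  # 2.0, i.e. students gained 2 points more than the district average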

Understanding VA Data
Average student gain on the WKCE relative to the district average, with adjustments for:
– the shape of the test score scale
– gender, race, disability, low-income status, language, and parents' education
– mid-year (November) testing
– patterns in gains from one year to the next
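A standard way to make adjustments like these is to regress student gains on the listed characteristics and average each school's residuals. The sketch below assumes that approach; the district's actual model is more elaborate, and the inputs here are hypothetical:

    import numpy as np

    def school_value_added(X, y, school_ids):
        # X: one row per student (intercept plus coded gender, race, disability,
        # low-income status, language, parents' education); y: WKCE gains.
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        residuals = y - X @ beta  # gain beyond what the adjustment model predicts
        # A school's value added is its students' average unexplained gain.
        return {s: residuals[school_ids == s].mean() for s in np.unique(school_ids)}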

Value Added and Proficiency. The two measures combine into four cases:
– High Proficiency and High Value Added
– High Proficiency and Low Value Added
– Low Proficiency and High Value Added
– Low Proficiency and Low Value Added
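A toy classifier for this 2x2 framing (the cutoffs are hypothetical; 74% simply echoes the reading AYP criterion above):

    # Places a school in one of the four cells above. Both cutoffs are
    # illustrative assumptions, not district policy.
    def quadrant(proficiency_pct, value_added, prof_cutoff=74.0):
        prof = "High Proficiency" if proficiency_pct >= prof_cutoff else "Low Proficiency"
        va = "High Value Added" if value_added >= 0 else "Low Value Added"
        return prof + " and " + va

    print(quadrant(82, 3.0))   # High Proficiency and High Value Added
    print(quadrant(60, -1.5))  # Low Proficiency and Low Value Added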

Other assessments to look at in the future:
– Science and Social Studies Test Results (WKCE, Grade 4)
– Other report card information
– Six-Trait Writing Results

MMSD's Special Education Information includes:
– Placement/Referral Data
– "Risk Factor" Ratio
– Least Restrictive Environment

When it comes to measures of relationships… School Climate Survey: responses from all students (grades 3-5), parents, and all staff; comparisons to the District, to the previous year, and internally between demographic groups.