Welcome to the School Improvement Quarterly / Special Education Leadership Meeting / RAMPS Meeting
Improved Data Analysis and Priority Setting - Beyond Descriptive Statistics and Assumptions
Assisting each LEA's capacity to conduct team data-driven decision-making
November 16, 2006
Essential Question - Focus of the Day
How do you know that specific strategies impact the needs of specific students?
Plan: communicating the goal, focusing in on the problem, finding root causes and effective solutions, and planning their careful implementation.
Do: implementing the creative solutions.
Check/Study: examining the results and comparing them with the expected outcomes; asking why and using information to explain the current outcomes.
Act: acting on the outcomes from the check/study phase to meet or exceed the desired goal. (Shewhart, 1939)
Planning - a problem-solving approach
Continues the problem-solving process by asking a series of questions geared toward understanding:
WHY the learning needs exist,
HOW the learning needs might be addressed, and
WHAT will constitute evidence that the selected strategies worked.
Planning - a unified process
Involves all district program staff and parents in preparing the consolidated application
Links the consolidated application to school plans
Links the consolidated application to district strategic plans
Leads to district consensus on what effective improvement planning is and the infrastructure needed to support it
Planning - a problem-solving approach
Builds an improvement culture that:
Believes all students can achieve high standards
Treats current practices and policies as hypotheses to be tested
Makes improvement an ongoing process, rather than an event, task, or document
Planning - a problem-solving approach
Identifies learning needs of individual students
Digs deeper, asking why, and clarifying underlying explanations of current student performance
Understands the learning context: the interaction of students, teachers, and instructional material
Measures the effectiveness of improvement strategies
Some Local Sources
DSTP Online Reports
AYP Accountability Reports
NWEA MAP
Surveys (student, staff, parent)
E-School Plus (attendance, scheduling, climate)
Evaluations/Surveys
Monitoring Reports
Local Assessments
Program Data
Digging Deeper with Data An Example in Disaggregation
School Improvement Plan Objectives – before disaggregation By the end of the 2006-2007 school year, at least 78% of all students will meet or exceed state standards by attaining proficiency or higher in Reading/Language Arts as measured by the Delaware Student Testing Program (DSTP).
2006 School Rating Status: Academic Review
Adequate Yearly Progress Status: Below Target

Subgroup            ELA % Meet/Exceed   ELA % Participation   Math % Meet/Exceed   Math % Participation   Graduation Rate
State Goal          62                  95                    41                   95                     78.0
All Students        75M                 98                    64M                  98                     82
African American    (50)                (95)                  (35)                 (95)
Hispanic            *                   *                     *                    *
White               77M                 98                    63M                  99
Limited English     *                   *                     *                    *
Special Education   (22)                (100)                 (19)                 (100)
Low Income          55N                 97                    47M                  99
School Improvement Plan Objectives – after disaggregation
Improve the percent of students meeting the standard on the Reading portion of the DSTP as follows:
Grade 10 African American: from 50% to 60%
Grade 10 Special Education: from 22% to 32%
Grade 10 Low Income: from 55% to 65%
Maintain targets for all students at 78% or higher
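The disaggregation step above can be sketched in code. Everything in this snippet is illustrative (made-up subgroup labels and records, not actual DSTP data); the point is that an acceptable overall rate can hide subgroups well below target.

```python
# Illustrative sketch of disaggregation; records and labels are hypothetical.
def percent_meeting(records):
    """Percent of records meeting the standard (None if no records)."""
    if not records:
        return None
    met = sum(1 for r in records if r["met_standard"])
    return round(100 * met / len(records))

def disaggregate(records, key="subgroup"):
    """Group records by a label and compute percent meeting per group."""
    groups = {}
    for r in records:
        groups.setdefault(r[key], []).append(r)
    return {label: percent_meeting(grp) for label, grp in groups.items()}

students = [
    {"subgroup": "A", "met_standard": True},
    {"subgroup": "A", "met_standard": False},  # group A: 50%
    {"subgroup": "B", "met_standard": True},
    {"subgroup": "B", "met_standard": True},   # group B: 100%
]
print(percent_meeting(students))  # → 75 overall, yet group A is at 50
print(disaggregate(students))     # → {'A': 50, 'B': 100}
```

An overall 75% would satisfy many targets, which is exactly why the plan objectives above are set per subgroup rather than for the school as a whole.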
Data: a quick look
Educators in a high school saw African-American students' performance drop slightly below 50% on their state mathematics test, putting the school on the state's school improvement list. The decision makers immediately suggested that all African-American students, whether or not they failed the test, be assigned peer tutors.
Data: a quick look
The decision makers ignored past trends, which indicated:
That the African-American students' scores were on an upward trajectory.
That the decline in math was so small that it could better be explained by factors other than their instructional program.
That Hispanic and white students who also failed the test had shown a steady decline in math performance for the past three years.
Data: a quick look
They made decisions based on one type of data and one way of looking at that data.
What data could they use to help explain pupil performance?
What questions could they ask related to the outcomes?
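The cautionary tale above amounts to: check the multi-year trend before reacting to a single year's number. A minimal sketch, with hypothetical yearly pass rates:

```python
# Sketch of checking a multi-year trend before acting on one data point.
# The yearly pass rates below are hypothetical, not actual school data.
from statistics import mean

def trend_slope(rates):
    """Least-squares slope of pass rate per year (positive = improving)."""
    xs = range(len(rates))
    xbar, ybar = mean(xs), mean(rates)
    num = sum((x - xbar) * (y - ybar) for x, y in zip(xs, rates))
    den = sum((x - xbar) ** 2 for x in xs)
    return num / den

# A final-year dip below 50% sitting on top of a rising four-year trend:
pass_rates = [44.0, 47.0, 51.0, 49.5]
print(trend_slope(pass_rates))  # → 2.05 points per year: upward trajectory
```

A positive slope over several years suggests the dip is noise rather than program failure, which is the distinction the decision makers in the example missed.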
Data – a closer look
Starts with school staff looking at their intended goals, objectives, and outcomes to:
Identify individual student learning needs (e.g., identifying by name students who are at PL1 or PL2 and those whose scale scores indicate they are at the PL3 cutoff),
Analyze the instructional-needs comments to derive any trends in high-priority learning needs, tracing the needs first back to standards and then to grade-level expectations.
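The first step above (identifying by name students at PL1/PL2 and those sitting at the PL3 cutoff) can be sketched as a simple roster filter. The cutoff score, band width, and roster are all hypothetical, not actual DSTP values.

```python
# Hypothetical values: a PL3 scale-score cutoff and an "at the cutoff" band.
PL3_CUTOFF = 400
CUTOFF_BAND = 10

def needs_attention(student):
    """Flag PL1/PL2 students and PL3 students just above the cutoff."""
    if student["level"] <= 2:
        return True
    return student["level"] == 3 and student["score"] < PL3_CUTOFF + CUTOFF_BAND

roster = [
    {"name": "Student A", "level": 1, "score": 350},
    {"name": "Student B", "level": 3, "score": 404},  # sitting at the cutoff
    {"name": "Student C", "level": 3, "score": 430},
    {"name": "Student D", "level": 4, "score": 460},
]
watch_list = [s["name"] for s in roster if needs_attention(s)]
print(watch_list)  # → ['Student A', 'Student B']
```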
Data – All roads lead to student achievement
Student Achievement Outcomes draw on five kinds of data:
Academics: data that indicate what students know and can perform.
Demographics: data that show the makeup of the student population and trends in its composition.
Connections to Learning: data that indirectly impact academics (attendance, nutrition, suspension, class sizes, etc.).
Perceptions: data that show the level of client understanding and satisfaction.
Program Information: data that provide insight into the quality of programs.
Data: a closer look
The decision makers might ask:
Where are we in relation to our goals?
At which proficiency levels are (all of) our students performing?
How did the students perform on specific content standards?
Who are the students that are having difficulty with specific content standards? Why?
Are there similar students having success with the same content standards? Why?
Data: a closer look
The decision makers might also ask:
What other data can be used to verify the explanations and define the problem more deeply?
What conditions affect the student? What non-academic data is available?
What are the trends/patterns in the data?
Who/what strategy has been most successful with the specified population?
How will the high school ultimately know the impact of the strategies (what worked and why)?
Digging Deeper with Data An Activity using your own LEA data
Digging Deeper with Data: Activity Using LEA Data
Directions
Tools: program applications, reports, AYP/academic data
Process
Revelations: What did we learn/confirm about the data and the process?
Data: Preparing to Dig - Collect
What evidence can we collect about our students' learning?
What evidence will show the knowledge, skills, and understandings our students have achieved?
What evidence shows who is meeting/exceeding the standard and who is not?
What do we know about the non-academic elements related to our students?
Data: Preparing to Dig - Organize
What content area and what standards are the focus?
How are we doing in relation to District Goals?
Data: Preparing to Dig - Analyze
Who are our students; what are their characteristics?
How did they do in relation to our goals; what did they achieve?
Why are the students performing the way they are?
What factors will help us understand our students? What are their educational experiences?
What impact has the instruction had on our goals/outcomes?
How have the resources been aligned to our goals/needs?
Data Discussions
Determine the outcomes for the discussion - define the problem:
What does the data tell us about our students' performance on state standards?
What explanations can be given for student performance?
What does our data tell us about staff understanding of the GLEs?
Digging Deeper with Data: Activity Using LEA Data
Share with the Group
Report out regarding findings, the process, and how digging deeper will be brought back and used at the LEA and school level.
Questions to ponder
Based on what you have studied:
Where are you in relation to your goals?
How do you analyze your state, district, and local data?
How did you use student non-academic data?
How might you enhance the use and analysis of all data?
What do you see as trends or patterns in your data? What questions does it raise?
How do you use the data to set school improvement goals?
How do you engage staff in the data analysis process?
Questions to ponder – With LEA Teams
Based on what you have studied:
What is the sum of the information?
Why are our children performing the way they are?
What is preventing these children from obtaining academic success?
What have we done? What strategies did we use with the students, and how did they impact their needs? Our goals? The standards?
Data Discussions – with local teams
Keep the focus on improvement, not on blame.
Help staff feel safe in sharing and using their classroom data.
Model collaborative problem solving.
Provide time for dialogue.
Data Discussions – with local teams
Keep the focus on what the data show in relation to the problem, not what staff think should be done to improve the results.
First analyze the data and clarify the problem with supportive data.
Keep solutions on hold until current outcomes have been explained and verified.
Data Discussions – with local teams
Guard against early conclusions of why the data look like they do.
Discussion focus:
What questions does the data raise?
What additional information do you need to address the questions?
Components for a Unified School Improvement Process
Data analysis support: data-driven improvement is not possible without it.
A school plan format that draws out linkages between underlying, explanatory needs; improvement strategies; and formative and summative measurement of results.
Components for a Unified School Improvement Process
Training and technical assistance for building administrators to carry out planning.
A top-down and bottom-up way to relate individual school needs to districtwide needs in the consolidated RAMPS application.
Food for thought
How have we assisted/confirmed your capacity to conduct team data-driven decision-making?
What do we need to do:
to better enhance your capacity to conduct team data-driven decision-making?
to support your local efforts with the use of data?
Continuous Improvement Process
Planning for Improvement → RAMPS Application → Self-Monitoring → Evaluation and Reporting