Presentation on theme: "Big Picture: Continuous Improvement, Aligned Improvement (June 2012)" — Presentation transcript:

1 June 2012

2 Big Picture: Continuous Improvement, Aligned Improvement

3 Goals of the day: Data literacy; Schoolwide Continuous Improvement Plan (SCIP) crafting

4 Housekeeping: What’s in the basket? Norms. Parking lot – big – little

5 Multiple Measurement Rating Overview

6 Benefits of the Waiver from USED
– Ability to implement a potentially more sensitive mechanism for federally mandated statewide accountability under ESEA – the Multiple Measurement Rating (MMR)
– New statewide targets for AYP driven by actual performance rather than a linear, time-delineated goal driven by NCLB’s 2014 deadline
– Elimination of prescriptive NCLB sanctions for all schools regardless of performance context
– Elimination of many required set-asides tied to NCLB sanctions at the school and district level, including ineffective supplemental educational services
– Differentiated improvement planning requirements for schools

7 Foundation for Statewide Accountability Remains the Same
– Minnesota’s Academic Standards
– Statewide assessments in reading, mathematics, and science – MCA-IIIs moving forward
– Public reporting
– Disaggregated data with an emphasis on achievement gaps
– Adequate Yearly Progress determinations (with new differentiated targets)

8 Minnesota’s Multiple Measurement Rating (MMR)
– New accountability system emphasizes student growth and closing the achievement gap in addition to proficiency
– Differentiates accountability for schools based upon performance across multiple domains included in the MMR
– Adds recognition of high performance
– Returns primary responsibility for improvement efforts to districts
– MDE will focus on the schools with the greatest needs and lowest performance

9 Recognition, Accountability and Support
MDE will assign Title I schools to three federally required accountability categories:
– Reward Schools (top 15% of Title I schools)
– Priority Schools (bottom 5% of Title I schools; three-year designation)
– Focus Schools (10% of Title I schools contributing most to the state’s achievement gaps; three-year designation)
MDE has also created two additional categories to recognize schools or promote improvement:
– Celebration Schools (Title I schools between the 60th and 85th percentiles)
– Continuous Improvement Schools (Title I schools in the bottom quartile not already identified as Focus or Priority Schools)

10 MMR’s Four Components
All Minnesota schools will receive an annual Multiple Measurement Rating (MMR) comprised of up to four components:
– Proficiency
– Student Growth
– Achievement Gap Closure
– Graduation Rate (for schools with graduating classes)
Accountability designations only apply to schools receiving federal Title I aid under NCLB (ESEA). Schools are ranked in each domain by grade-level cluster.

11 Total MMR Each domain is worth 25 points. The MMR is generated by dividing the total number of points earned by the total number of points possible. For most elementary and middle schools, 75 points are possible; for most high schools, 100 points are possible. The total MMR is a 0–100 percentage for all schools, reflecting the proportion of points earned.

12 Focus Rating In addition to an MMR, every school gets a Focus Rating (FR), used to identify Focus Schools. The Focus Rating measures proficiency and growth of students of color and students receiving special services (EL, special education, free and reduced-price lunch). It combines Achievement Gap Reduction and Focused Proficiency; each domain is worth 25 points, for 50 possible points.
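The points-to-percentage arithmetic on the two slides above is simple enough to sketch. A minimal illustration (not MDE's actual calculation code), with hypothetical domain names and point values:

```python
# Minimal sketch: turning domain points into an MMR or FR percentage.
# Each domain is worth 25 points; the MMR uses 3 or 4 domains, the FR uses 2.
# Domain names and example point values below are hypothetical.

def rating_percentage(domain_points):
    """domain_points: dict of domain name -> points earned (0-25 each)."""
    possible = 25 * len(domain_points)   # 75, 100, or 50 points possible
    earned = sum(domain_points.values())
    return 100 * earned / possible       # 0-100 percentage of points earned

# Hypothetical elementary school (no graduation-rate domain):
mmr = rating_percentage({"proficiency": 18.2, "growth": 20.5, "gap_reduction": 12.9})
# Hypothetical Focus Rating (two domains):
fr = rating_percentage({"focused_proficiency": 14.0, "gap_reduction": 12.9})
print(round(mmr, 2), round(fr, 2))
```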

13 Proficiency The Proficiency domain uses the AYP index model. Schools earn points based on a weighted percentage of subgroups making AYP, with weighting based on the size of subgroups. Unlike in the AYP calculation, in MMR Proficiency groups cannot make AYP through Safe Harbor.
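A rough sketch of the size-weighted index idea, assuming hypothetical subgroup counts and AYP results; MDE's actual rules (minimum cell sizes, how the weighted percentage maps to points, and so on) are not modeled here:

```python
# Minimal sketch of an AYP-style index: a size-weighted percentage of
# subgroups meeting the target. Subgroup names and counts are hypothetical.

subgroups = [
    # (name, number of students, made AYP?)
    ("All Students", 400, True),
    ("Hispanic", 60, False),
    ("Black", 90, True),
    ("Free/Reduced Lunch", 210, True),
]

total = sum(n for _, n, _ in subgroups)
weighted_pct = 100 * sum(n for _, n, made in subgroups if made) / total
print(f"{weighted_pct:.1f}% of subgroup weight made AYP")
```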

14 Growth Growth measures the ability of schools to get students to exceed predicted growth. Growth predictions are based on students’ last assessment result; predictions are generated by looking at two cohorts of students – where they scored one year and where they scored the next. A student’s growth score is based on being above or below the prediction at each score point, and the school growth score is the average of the student growth scores.
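A simplified illustration of the above-/below-prediction idea; the actual MMR growth model is more involved, and the scores and predictions below are invented:

```python
# Minimal sketch of the growth idea on the slide: each student has a predicted
# score based on the prior year's result; the student's growth value reflects
# being above or below that prediction, and the school score is the average.
# The 0/1 scoring here is a simplification of the real model.

students = [
    # (prior-year score, predicted current score, actual current score)
    (340, 352, 360),
    (355, 366, 361),
    (320, 333, 333),
]

growth_values = [1 if actual >= predicted else 0
                 for _, predicted, actual in students]
school_growth = sum(growth_values) / len(growth_values)
print(f"Share of students at or above predicted growth: {school_growth:.2f}")
```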

15 Achievement Gap Reduction Measures the ability of schools to get higher levels of growth from lower-performing subgroups than the statewide average growth of higher-performing subgroups. Growth of individual subgroups of students of color is compared to growth of white students, ELs to non-ELs, FRP students to non-FRP students, and special education students to non-special-education students. The school’s growth score for each lower-performing group is subtracted from the statewide average of the corresponding higher-performing group; a negative score indicates success.
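A tiny worked sketch of that subtraction, with hypothetical growth values:

```python
# Minimal sketch of the gap-reduction comparison described on the slide:
# subtract the school's growth score for a lower-performing group from the
# statewide average growth of the corresponding higher-performing group.
# A negative result means the school's group out-grew the statewide
# comparison group. Numbers are hypothetical.

statewide_white_growth = 0.02    # statewide average growth, comparison group
school_hispanic_growth = 0.05    # this school's growth, lower-performing group

gap = statewide_white_growth - school_hispanic_growth
print(gap)  # -0.03: negative, so the gap is closing for this comparison
```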

16 Graduation Rate Uses the same methodology as the Proficiency domain: looks at the percentage of subgroups that made AYP in graduation rate. Current AYP graduation-rate targets are 85%; targets are changing next year. Groups can only get credit for meeting the target, not through year-to-year improvements.

17 Focused Proficiency Like the Proficiency domain, Focused Proficiency uses the AYP index model. Schools earn points based on a weighted percentage of subgroups making AYP, but the All Students subgroup and the White subgroup are excluded. Weighting is based on the size of subgroups.

18 Exit Criteria Priority Schools: Two consecutive years out of the bottom 25 percent on the MMR (‘13 & ‘14). Focus Schools: Two consecutive years out of the bottom 25 percent on the FR (‘13 & ‘14). SIG Schools: Opportunity to exit at end of grant (‘13) if out of bottom 25 percent on MMR that year. Priority or Focus: Immediate exit if a Reward School after any year starting in ‘13.
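A minimal sketch of the exit logic as stated on this slide, using hypothetical percentile ranks:

```python
# Minimal sketch of the exit rule for Priority/Focus schools: exit after two
# consecutive years (2013 and 2014) above the bottom 25 percent on the
# relevant rating, or immediately if designated a Reward School.
# Percentile data are hypothetical.

def exits(percentiles_by_year, is_reward_school=False):
    """percentiles_by_year: {year: statewide percentile rank on the MMR or FR}."""
    if is_reward_school:                           # immediate exit after any year from 2013 on
        return True
    return all(percentiles_by_year.get(y, 0) > 25  # above the bottom 25% in both '13 and '14
               for y in (2013, 2014))

print(exits({2013: 31, 2014: 40}))  # True: two consecutive years out of the bottom quartile
print(exits({2013: 18, 2014: 40}))  # False
```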

19 2011 25th Percentiles
– Elementary schools: MMR 33.81%; FR 42.55%; lowest Reward MMR 73.30%
– Middle schools: MMR 18.68%; FR 42.96%; lowest Reward MMR 79.05%
– High schools: MMR 22.05%; FR 31.99%; lowest Reward MMR 76.15%
Numbers will be different every year.

20 Annual MMR and FR
– MDE must run AYP results based upon the newly approved targets – target date July 18, 2012
– Test results will come out August 1, 2012
– MMR and FR results – August 27, 2012
– Media release – August 29, 2012
– MMR and FR public release – August 30, 2012

21 SCIP and MMR/FR identifications SPPS has received permission to use the SCIP in lieu of the state’s improvement plan format. An emphasis on a “proficient” SCIP allows us to ensure a level of quality to the plan. Continuous Improvement Schools will also face the same requirements. Duration, intensity, and frequency are key criteria for supporting Focus Schools.

22 Next Steps Plans will be submitted to MDE no later than September 1, 2012. Some form of parental communication has to be sent, but MDE has not yet indicated the “what.” The 20% Title I set-aside has to be addressed in the Title I section of your SCIP – expected alignment between goals, budget, and action plan.

23 Stars in the Elevator A protocol for Minnesota’s new Multiple Measurement Rating (MMR) and Focus Rating (FR). Rashmi Vashisht, Data Coach, School & Program Improvement; Joe Munnich, Policy, Planning and Intergovernmental Relations.

24 Introduction

25 Objectives
I can name the 3-4 components of the MMR:
– Proficiency
– Gap reduction
– Growth
– Graduation rate (HS only)
I can name the 2 components of the FR:
– Focus proficiency
– Gap reduction
I can report my school’s position relative to other schools in the state based on the state’s Multiple Measurement Rating (MMR) and Focus Rating (FR).

26 Elevator Worksheet

27 Your Data: MMR/FR Score Sheets (color-coded sheets: blue, green, purple, red, yellow, purple)

28 Elevator Worksheet

29 RAFT
R – Role: School Leader
A – Audience: Families, staff, community members
F – Form: Elevator Speech
T – Topic: Multiple Measurement Rating and Focus Rating

30 Elevator Speech Complete the writing prompt. In MMR in 2011: – We got the most points from… – We got the fewest points from… In Focus Rating in 2011: – We got more points from… – We got fewer points from…

31 Report out: Going up
In MMR – stand up if your highest was:
– Proficiency
– Growth
– Gap reduction
– Graduation (HS only)
In Focus Rating – stand up if your highest was:
– Gap reduction
– Focus proficiency

32 Reading between the dots: MAP Progress Toward Proficiency Objective: A protocol to analyze data to determine the rate of progress toward a target.

33 Framing Issues and Key Concepts Managing the gap between current levels of proficiency and expectation is what our mission is all about. The two critical pieces of information we need are: How big is the gap? How much time do we have to close it? The answers to these questions define our instructional mission.

34 Steps (a) and (b) [Worksheet example] Step (a): Grade Level – 6th; Content Area – Reading. Step (b): record fall and spring results – Grade 6: Fall 8, Spring 16.

35 (c): Progress toward proficiency [Worksheet example] 16 – 8 = 8 points of progress from fall to spring.

36 (d): Average rate of increase [Worksheet example] 8 points ÷ 8 months = 1 point per month.

37 Finding the Average Rate of Progress toward Proficiency MCA-II Reading / All Students, 2011-12. District Target = 75%. [Chart: Fall 2011 = 8.0, Winter 2012, Spring 2012 = 16.0.] 16.0 – 8.0 = 8.0; 8.0% divided by 8 months = 1% rate of growth per month.

38 (e) and (f): To Proficiency [Worksheet example] Target 75 – current 16 = 59 points remaining, at a rate of 1 point per month.

39 (g) and (h): Time to Proficiency [Worksheet example] 59 points ÷ 1 point per month = 59 months: it would take 4 years 11 months to close the gap.
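The whole protocol, using the worked example from slides 34-39 (grade 6 reading, 8% proficient in fall, 16% in spring, 75% district target), can be sketched as:

```python
# Sketch of the "rate of progress toward proficiency" protocol from slides
# 32-39, reproducing the worked example on the slides. Variable names are ours.

fall_pct, spring_pct = 8.0, 16.0   # steps (a)/(b): record fall and spring proficiency
months_between = 8                 # fall-to-spring span used on the slides

progress = spring_pct - fall_pct                 # (c): 8 percentage points
rate_per_month = progress / months_between       # (d): 1 point per month

target = 75.0                                    # district target
remaining = target - spring_pct                  # (e)/(f): 59 points to go

months_to_target = remaining / rate_per_month    # (g): 59 months
years, months = divmod(round(months_to_target), 12)
print(f"{years} years {months} months to reach the target")  # (h): 4 years 11 months
```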

40 Next Steps: Are you happy with:
– the % of students on target for proficiency?
– Based on your calculations, is this rate of progress adequate or acceptable? Why or why not?
Implications for your SCIP: Given that we must increase the % of students who move to proficiency at an accelerated pace, how have you done with the rate over the past year and what does this information mean to you for the next 5 years?
Use this protocol to analyze your own data.

41 Possible Conclusions What we have been doing has not been predictably effective for ALL of our kids. If we want to become more effective, we can’t do the same things harder, faster, or longer. We need to do different things that are more effective.

42 So… how? 1. Decide what is important for students to know. 2. Teach what is important for students to know. 3. Keep track of how students are showing what they know. 4. Make changes according to the data and results you collect! David Tilly, 2005

43 Response to Instruction & Intervention (RtI²) Problem-Solving Process [Cycle diagram] Define the Problem (defining problem / directly measuring behavior) → Problem Analysis (validating problem; identify variables that contribute to the problem) → Develop Plan → Implement Plan (implement as intended; progress monitor; modify as necessary) → Evaluate.

44 Slicing the Pie: Analyzing the Viewpoint Growth vs. Proficiency Report

45 Objectives:
– To access the Viewpoint Growth vs. Proficiency Report
– To examine growth vs. proficiency by ethnicity
– To consider how this data might support the SCIP

46 Inquiry Questions:
– What percentage of my students (overall) met targeted growth and are proficient? Did not meet growth and are not proficient?
– Which student group (ethnicity, grade level) had the most students making targeted growth?
– Which student group (ethnicity, grade level) had the fewest students making targeted growth?
– Who are they? How can we act?
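One way to answer these questions from student-level data is a simple cross-tabulation of growth status against proficiency. A minimal sketch with made-up records and field names (the Viewpoint report produces this breakdown for you):

```python
# Minimal sketch of the growth-vs-proficiency breakdown behind the Viewpoint
# report: cross-tabulate each student's growth status against proficiency so
# the four "pie slices" (and the students in the "red slice") can be listed.
# The records below are hypothetical.

from collections import Counter

students = [
    # (name, ethnicity, met growth target?, proficient?)
    ("A. Lee", "Hispanic", True, True),
    ("B. Vang", "Asian", True, False),
    ("C. Ali", "Black", False, False),
    ("D. Smith", "White", False, True),
]

slices = Counter((met, prof) for _, _, met, prof in students)
for (met, prof), n in slices.items():
    label = f"{'Met' if met else 'Below'} growth / {'proficient' if prof else 'not proficient'}"
    print(f"{label}: {100 * n / len(students):.1f}%")

# The "red slice": below growth and not proficient, listed by name and group
red = [(name, eth) for name, eth, met, prof in students if not met and not prof]
print("Needs attention:", red)
```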

47

48

49 MAP Growth vs Proficiency School Name, 2011-2012 [Report settings] Status: Active Students Only; Subject: Reading; Start Season: Fall 2011; End Season: Spring 2012; Test Status: Tested (Both Seasons); Scale: SPPS Targets – All Students.

50 [Pie chart: quadrants Met Growth / Below Growth by Proficient / Not Proficient]

51 [Pie chart: quadrants Met Growth / Below Growth by Proficient / Not Proficient; labeled slices – 36.4% met growth and proficient, 27.2% met growth but not proficient, 9% below growth but proficient]

52 State some:
Observations
1. 45.4% of all students in grades 3-6 are proficient (met target) in MAP Reading by Spring 2012
2. 63.6% of our students made growth!
3. 54.4% of all students in grades 3-6 are not proficient – this is over half our students!
Inferences
1. Last year, we had 36.6% proficiency and
2. 58.4% growth – what was new this year in reading that made a difference: Literacy PLCs with Data Teams, literacy coach
3. Based on the Spring MAP, the lowest strand is word recognition/vocabulary, same as last year

53 Questions that remain How did student groups by ethnicity do compared to all students in proficiency/growth? Who are the individual students who were “in the red slice” – not proficient/below growth?

54 Click on the filter

55 Choose “Modify Filters”

56 The drop-down menu has options – choose one

57 Click the save icon

58 Close the window Note: click to load the report; click again to view the report

59 MAP Growth vs Proficiency School Name, 2011-2012 [Report settings] Status: Active Students Only; Attribute: Ethnicity – Hispanic. Click on a slice of the pie to get individual student data.

60 Below Growth; Not Proficient

61 Let’s bring it to the SCIP

62

63 * Every system is perfectly aligned for the results it gets.

