www.engageNY.org
1 New York State Education Department Interpreting and Using Your New York State-Provided Growth Scores August 2012
2 By the End of This Presentation…
You should be able to:
–Explain what information is included in a teacher growth report
–Answer some common questions about the information
–Describe the differences between a teacher-level, school-level, and district-level report
–Understand how districts, principals, and teachers can use the information on the reports as one source of data that can help improve instruction
3 Evaluating Educator Effectiveness (grades 4–8 ELA and Math teachers & their principals)
–Growth (20%): student growth on state assessments (state-provided) or student learning objectives
–Locally Selected Measures (20%): student growth or achievement; options selected through collective bargaining
–Other Measures (60%): rubrics; sources of evidence: observations, visits, surveys, etc.
4 Key Points about NYS Growth Measures
–We are measuring student growth, not achievement: teachers can achieve high ratings regardless of their students' incoming levels of achievement
–We are measuring growth compared to similar students. Similar students share up to three years of the same prior achievement and three student-level characteristics (economic disadvantage, SWD, and ELL status)
–Every educator has a fair chance to demonstrate effectiveness on these measures regardless of the composition of his/her class or school
5 Review of Terms
–SGP (student growth percentile): the result of a statistical model that calculates each student's change in achievement between two or more points in time on a State assessment or other comparable measure, and compares each student's performance to that of similarly achieving students
–MGP (mean growth percentile), unadjusted and adjusted: the average of the student growth percentiles attributed to a given educator. For evaluation purposes, the overall adjusted MGP is used: the MGP that includes all of a teacher's or principal's students and takes into account student demographics (ELL, SWD, and economic disadvantage status).
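The averaging step in the MGP definition can be sketched in a few lines. This is only an illustration, not NYSED's actual statistical model (which produces the SGPs themselves); the function name and example scores are invented, and the 16-score minimum it enforces is the reporting rule described later in this deck.

```python
# Minimal sketch of how an MGP relates to the SGPs attributed to one
# educator. NOT NYSED's model; names and scores here are hypothetical.

def mean_growth_percentile(sgps, minimum_scores=16):
    """Average the SGPs linked to an educator; report nothing below the minimum."""
    if len(sgps) < minimum_scores:
        return None  # NYSED reports no MGP for fewer than 16 student scores
    return sum(sgps) / len(sgps)

# A hypothetical class of 20 student scores:
example_sgps = [40, 45, 50, 55, 60] * 4
print(mean_growth_percentile(example_sgps))  # 50.0
print(mean_growth_percentile([40, 45, 50]))  # None (fewer than 16 scores)
```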
6 MGPs and Statistical Confidence
[Figure: a sample MGP of 87 shown with its confidence range, upper limit, and lower limit]
NYSED will provide a 95% confidence range, meaning we can be 95% confident that an educator's "true" MGP lies within that range. Upper and lower limits of MGPs will also be provided. An educator's confidence range depends on a number of factors, including the number of student scores included in his or her MGP and the variability of student performance in the classroom.
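The two factors the slide names can be seen in a generic confidence-range sketch. To be clear, NYSED computes its ranges inside its own growth model; this normal-approximation version is only an assumption-laden illustration of why more student scores and less variable performance yield a narrower range.

```python
import math

# Generic illustration only, not NYSED's method: a normal-approximation
# 95% confidence range around a mean of student growth percentiles.

def confidence_range(sgps, z=1.96):
    """Return a (lower, upper) 95% confidence range for the mean of `sgps`."""
    n = len(sgps)
    mean = sum(sgps) / n
    variance = sum((s - mean) ** 2 for s in sgps) / (n - 1)  # sample variance
    half_width = z * math.sqrt(variance / n)  # z times the standard error
    return mean - half_width, mean + half_width

lo, hi = confidence_range([40, 50, 60])        # few, spread-out scores
lo2, hi2 = confidence_range([40, 50, 60] * 4)  # same spread, more scores
print(hi - lo > hi2 - lo2)  # True: more scores give a narrower range
```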
7 Growth Ratings and Score Ranges
The growth scores and ratings are based on an educator's combined MGP. Score ranges shown are for 2011–12.
–Highly Effective: well above state average for similar students (growth score 18–20)
–Effective: results meet state average for similar students (9–17)
–Developing: below state average for similar students (3–8)
–Ineffective: well below state average for similar students (0–2)
For detailed information, see the posted webinar.
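The rating table above is a straightforward lookup, which can be written directly in code. The score bands are the 2011–12 ranges from this slide; the function name is ours.

```python
# The slide's 2011-12 score bands as a lookup; the function name is ours.

def growth_rating(score):
    """Map a combined growth score (0-20) to its 2011-12 growth rating."""
    if not 0 <= score <= 20:
        raise ValueError("growth scores range from 0 to 20")
    if score >= 18:
        return "Highly Effective"
    if score >= 9:
        return "Effective"
    if score >= 3:
        return "Developing"
    return "Ineffective"

print(growth_rating(14))  # Effective
print(growth_rating(6))   # Developing
```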
8 First, let's look at a growth report about a teacher: Jane Eyre
9 Teacher-level Report (District X, School #1, teacher Jane Eyre, "Teacher 1D")
–Teacher 1D's Growth Score and Growth Rating are listed here
–Teacher 1D has a higher adjusted MGP in Math than in ELA
–Teacher 1D does not have growth data reported for any of the subgroups, because at least 16 student scores are required to report any data
–Jane's MGP = 47 (this is what is used to determine the growth score and growth rating)
–Jane's Upper Limit = 55 and Lower Limit = 39
Common Questions about Teacher-level Reports
Will an educator's adjusted MGP always be higher than the unadjusted, since the adjusted MGP takes into account not just prior student achievement but also economic disadvantage, ELL, and SWD status?
Not necessarily. Let's take the following simplified example of a relay race:
Team | Order of Finish | Overall Percentile | Division | Division Finish | Division Percentile
L    | 1               | 99                 | A        | 1               | —
M    | 2               | 66                 | B        | 1               | 99
N    | 3               | 50                 | B        | 2               | 33
O    | 4               | —                  | A        | 2               | —
P    | 5               | 17                 | B        | 3               | 1
Q    | 6               | 1                  | A        | 3               | 1
Team M's division percentile (99, against similar Division B teams) is higher than its overall percentile (66), but Team P's division percentile (1) is lower than its overall percentile (17).
Common Questions about Teacher-level Reports
What is the difference between an educator's MGP and the percent of an educator's students above the state median? Let's take the following simplified example:
Student SGP | Above State Median of 50?
20          | No
30          | No
55          | Yes
55          | Yes
60          | Yes
Teacher MGP (average of all SGPs): (20 + 30 + 55 + 55 + 60) divided by 5 = 44
Percent of students above the state median: 3 out of 5 students above 50 = 60%
Note: in NYSED reports, no MGP is calculated for fewer than 16 student scores.
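The slide's worked example can be reproduced directly: the MGP averages every SGP, while "% above state median" only counts how many SGPs clear 50. The function name is ours; the numbers are the slide's.

```python
# The slide's simplified example in code; the function name is ours.

def mgp_and_percent_above_median(sgps, median=50):
    """Return (MGP, percent of students whose SGP exceeds the state median)."""
    mgp = sum(sgps) / len(sgps)
    percent_above = 100 * sum(1 for s in sgps if s > median) / len(sgps)
    return mgp, percent_above

print(mgp_and_percent_above_median([20, 30, 55, 55, 60]))  # (44.0, 60.0)
```

Note how a teacher whose students cluster just above the median can show a high "% above median" yet a modest MGP, and vice versa: the two statistics summarize the same SGPs differently.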
School-level Report (District X, School #1)
–An adjusted MGP and associated confidence range will be reported for each subject and grade level within the school
–49% of students at School #1 scored above the State median
–The Growth Score and Growth Rating for the Principal of School #1 are listed here
–School #1 has scores broken out by subject for grades 4–8
–The report also shows what percent of the student scores come from economically disadvantaged students; School #1 has no scores from English language learners
See the Summary of Revised APPR Provisions Memo.
School-level Report—Detailed View (District X, School #1, Teachers 1A–1L)
–School #1 has 12 teachers who teach grades 4–8 ELA and Math
–Teacher 1B has the most student scores linked to him (43 scores)
–43 student scores could not be linked to any of the teachers
–Each teacher receives an adjusted MGP and associated confidence range that are used to determine the growth rating and growth score
–Teachers 1E and 1G did not receive any growth data because they are linked to fewer than 16 student scores
Frequently Asked Questions
Why do some teachers have a combined MGP but no subject-level MGPs?
–A minimum of 16 student scores is required
–The student must be continuously enrolled in a course that leads to an assessment for 195 calendar days for ELA or 203 calendar days for Math
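The enrollment rule just stated can be sketched as a simple check. The constant and function names are hypothetical, but the day thresholds come from the slide.

```python
# Sketch of the continuous-enrollment rule above; constant and function
# names are hypothetical, thresholds are the slide's.

REQUIRED_DAYS = {"ELA": 195, "Math": 203}

def score_counts(subject, days_continuously_enrolled):
    """A student's score feeds a subject-level MGP only after enough
    continuous enrollment in a course leading to that assessment."""
    return days_continuously_enrolled >= REQUIRED_DAYS[subject]

print(score_counts("ELA", 195))   # True
print(score_counts("Math", 200))  # False (Math requires 203 days)
```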
Frequently Asked Questions
Why are some students unassigned? How did the State define "unassigned"?
Students are considered unassigned if the district did not provide a valid teacher-of-record for that student, or if the student–teacher linkage did not meet the continuous enrollment guidelines set forth in the APPR guidance (professional-performance-review-law-and-regulations/). To meet the continuous enrollment guidelines, a student needed to be linked to a teacher for 195 calendar days for ELA, or 203 calendar days for Math.
District-level View—Page 1: NY State Summary
–NYS summary data is included on ALL district reports
–Number of student scores included in calculation of the State MGP
–NY Statewide Adjusted MGP = 52; State Median = 50
–Statewide, about 50% of ELL, SWD, and economically disadvantaged students scored above the State median
District-level View—Pages 1–2: District X Summary
–District X summary data, continued on the next page of the report
–Number of student scores included in calculation of the district-wide MGP
–District-wide Adjusted MGP
District-level View—Page 3: List of Schools
–District X has two schools that have grades 4–8 ELA and Math scores: School #1 and School #2
–Principal of School #1: Growth Score = 14, Growth Rating = Effective
–Principal of School #2: Growth Score = 6, Growth Rating = Developing
Using Growth Score Results
Beyond evaluation, growth scores can help teachers, principals, and districts with instructional improvement.
–Of course, these measures are only one of multiple sources of evidence to use for this purpose
–The best insight comes from considering the results in the context of other information about a teacher, group of teachers, principal, or group of schools
Districts may want to: Analyze district-level information using these reflective questions: How much did our students grow, on average, compared to similar students? Is this higher, lower, or about what we would have expected? Why? How do our MGPs for each reported subgroup (ELL, SWD, economically disadvantaged students, high- and low-achieving students) compare to each other and to our overall MGP? Are there any patterns? Are the MGPs higher, lower, or about what we would have expected? Why? How do the MGPs compare by subject and across grade levels? Why might they be similar or different? What should we do to understand any surprises using other information and evidence? Do we have the right plans in place to aid in professional growth and learning for our educators?
Districts may want to: Convene principals to reflect upon their school growth results in context of other information about student learning and teacher effectiveness in their schools: –Use BOCES trainers and/or SED online resources to ensure basic understanding of the measures and what information is found on reports –Engage principals individually or in a group to reflect on questions about their school information in the context of other evidence of teacher effectiveness: How much did the students of my teachers grow, on average, compared to similar students and how does this differ across teachers? Are there differences across grades or subjects? How do my teachers’ MGPs differ across each reported subgroup? Do I see any patterns?
Districts may want to: Plan for communicating with teachers about their results:
–How will teachers get general information about the growth measures? (District or school-level training, or self-directed use of SED resources)
–How and when will teachers receive their individual reports? (Remember: SED will provide online access for individual educators later in the fall)
–Use BOCES trainers and/or SED online resources to ensure basic understanding of the measures and what information is found on reports
Principals may want to:
–Consider the reflective questions in their school-level reports
–See the Principal's Guide to Interpreting Growth Scores: content/uploads/2012/06/Principals_Guide_to_Interpreting_Your_Growth_Score.pdf
–See the Sample Principal Report—Annotated: content/uploads/2012/06/Principal_Sample_Growth_Report.pdf
–Plan how teachers will get the information they need to understand their own growth reports
Teachers may want to:
–Review materials from SED about growth measures
–View the "Growth Model for Educator Evaluation" Webinar
–View the "Using Growth Measures for Educator Evaluation" Webinar
–See the Teacher's Guide to Interpreting Growth Scores: content/uploads/2012/06/Teachers_Guide_to_Interpreting_Your_Growth_Score.pdf
–See the Sample Teacher Report—Annotated: content/uploads/2012/06/Teacher_Sample_Growth_Report.pdf
Consider the following reflective questions:
–How much did my students grow, on average, compared to similar students? Is this higher, lower, or about what I would have expected? Why?
–How does this information about student growth align with information about my instructional practice received through observations or other measures? Why might this be?
For More Information… Please review our posted Guides for Interpreting Your Growth Scores: measures/ And the guidance on NYS’s APPR Law and Regulations: professional-performance-review-law-and-regulations/