Educator Evaluation: Challenges and Opportunities


1 Educator Evaluation: Challenges and Opportunities
Massachusetts Association of School Superintendents, May 19, 2011 (updated)

2 The three-year Superintendent Induction Program
Context: The three-year Superintendent Induction Program, an ESE – M.A.S.S. partnership.
Four focus areas:
Focused Instructional Leadership
Collaborative Relationships and Effective Leadership Teams
Strategic Management of Human and Other Resources
Robust System of Supervision and Evaluation

3 Agenda Goals of the Proposed Regulations
Key Features of the Proposed Regulations
Components of the Model System
Challenge: Assessing Educator Impact on Student Growth using Multiple Measures
Challenge: Self-Assessment and Goal Setting
Challenge: Timeline for Implementation

4 Goals of Improved Educator Evaluation
Promote leaders’ and teachers’ growth and development
Place student learning at the center, using multiple measures of student learning, growth and achievement
Recognize excellence in teaching and leading
Set a high bar for professional status (tenure)
Shorten timelines for improvement
These goals were developed by the task force, and they are goals that few would disagree with. The challenge is going to be operationalizing these goals.

5 Key Features
5-step Evaluation Cycle, starting with educator self-reflection and goal setting
3 Categories of Evidence:
Multiple measures of student learning, growth and achievement
Products of practice
Educator’s collection of other evidence
Most educators experience evaluation passively – it’s something done TO them, rather than with them, with the goal of supporting them. In many places, evaluation is based solely on observing 1-2 classes, and these observations are almost always announced in advance, which results in the “dog and pony” show. But there are exceptions. Some districts in MA are already innovating around evaluation and have developed systems that work to support educators – but we want to make sure that these exceptions become the rule. We really want evaluation to be a process by which educators reflect on their own practice, set goals, and receive support in achieving those goals through improving personal practice.

6 Features, cont’d
4 Standards with “core” indicators for administrators and teachers
4 Ratings on performance: exemplary, proficient, needs improvement, unsatisfactory
3 Ratings of impact on student learning, with focus on learning gains: high, moderate, low
Different Paths & Plans depending on career stage and performance

7 A 5-Step Evaluation Cycle
Every teacher participates in a 5-step evaluation process. It begins with the educator who is being evaluated – either the teacher or principal – engaging in a self-reflection and self-assessment process. Then, with his or her supervisor, the educator analyzes that reflection and develops goals and plans for the coming year. These are more or less intense, depending on prior level of performance and how new the educator is to the profession. Then the plan is implemented, and the cycle moves forward with formative assessment and evaluation and, ultimately, with summative evaluation. And then the cycle starts all over again!

8 Rubrics for 4 Statewide Standards and Indicators
Administrators: Curriculum, Instruction & Assessment; Management & Operations; Family & Community Partnerships; Professional Culture
Teachers: Curriculum, Planning & Assessment; Teaching All Students; Family & Community Engagement; Professional Culture
A keystone of the proposed regulations is a revised and streamlined set of educator standards and core indicators for teachers and administrators. These will replace the current principles of effective practice. These principles are parallel for the two roles – but not identical. Among other things, they raise the importance of family and community engagement and of professional culture. Districts must use them, but may add additional standards and indicators identified through the collective bargaining process. They are used to focus evaluations on standards of practice that reflect what teachers and leaders must know and be able to do. In the proposed framework, an educator’s performance ratings will be linked to these standards of practice as well as to their attainment of goals they have set in the areas of their own professional practice and their impact on student learning. Eventually, these new standards will inform preparation program standards and licensure, improving the alignment across preparation, licensure, and evaluation where it makes sense.

9 3 Categories of Evidence
Multiple measures of student learning, growth and achievement
Products of practice, including observation of practice (announced and unannounced)
Educator’s collection of other evidence, including analysis of feedback from: students, parents, and staff (for administrators)
Products of practice: DESE surveyed the 22 urban districts about whether unannounced observation is practiced in their district. In over half of these districts, unannounced observation was prohibited, either contractually or based on long-standing past practice.
Educator’s collection of other evidence: Nationally, it is rare practice for administrators to be evaluated by their staff.

10 The summative performance rating (exemplary, proficient, needs improvement, unsatisfactory)
Based on:
Performance against each of four standards, and
Progress toward meeting student learning and professional practice goals
Educator’s impact on student learning, growth and achievement “counts” in standards on curriculum, instruction and assessment
Every educator will receive one of four summative performance ratings – exemplary, proficient, needs improvement or unsatisfactory. These ratings are based on performance against each of four standards (for an administrator: curriculum, instruction and assessment; management & operations; family and community partnerships; professional culture. For a teacher: curriculum, planning and assessment; teaching all students; family and community engagement; professional culture) and on progress toward goals. In addition, the educator’s impact on student learning, growth and achievement contributes to the summative assessment.

11 Paths & Plans: Differentiated by Career Stage and Performance
Educators in their first three years: Developing Educator Plan (one year)
Performance rated as proficient or exemplary: Self-directed Growth Plan (one or two years)
Performance rated as in need of improvement: Directed Growth Plan (one year)
Performance rated as unsatisfactory: Improvement Plan (up to one year)
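The paths above amount to a simple decision rule keyed to career stage and the most recent summative rating. Below is a minimal sketch of that rule in Python; the function name, inputs, and exact plan labels are illustrative only, since actual plan placement is decided through the district's evaluation process, not software.

```python
def assign_plan(years_in_profession: int, summative_rating: str) -> str:
    """Illustrative mapping of career stage and rating to an educator plan,
    following the paths listed on this slide."""
    if years_in_profession <= 3:
        return "Developing Educator Plan (one year)"
    plans = {
        "exemplary": "Self-directed Growth Plan (one or two years)",
        "proficient": "Self-directed Growth Plan (one or two years)",
        "needs improvement": "Directed Growth Plan (one year)",
        "unsatisfactory": "Improvement Plan (up to one year)",
    }
    return plans[summative_rating.lower()]

print(assign_plan(2, "proficient"))         # Developing Educator Plan (one year)
print(assign_plan(8, "needs improvement"))  # Directed Growth Plan (one year)
```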

12 Decision Flow for Experienced Educators

13 Linking Student Learning and Educator Practice
Rating of Educator Practice (four levels): Exemplary, Proficient, Needs Improvement, Unsatisfactory
Impact on Student Learning (three levels): Low, Moderate, High
Impact is determined from multiple measures of student learning, including MCAS student growth percentiles where available, with a focus on learning gains.

14 What Happens When There’s a Discrepancy?
Low Rating of Educator Practice BUT Moderate or High Impact on Student Learning:
Evaluator reviews the discrepancy with the educator.
Evaluator’s supervisor considers discrepancy trends in evaluating the evaluator.

15 What Happens When There’s a Discrepancy?
High Rating of Educator Practice BUT Low Impact on Student Learning:
Educator has a 1-year growth plan focused on the discrepancy.
Evaluator’s supervisor MUST review the rating.
Superintendent has final authority to determine the summative rating.
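Taken together, slides 14 and 15 describe a small set of follow-up actions that depend on how the practice rating and the impact rating combine. The sketch below encodes just those two discrepancy cases; treating "needs improvement" and "unsatisfactory" as low practice ratings is my assumption, and in reality these decisions are made by people, not code.

```python
def discrepancy_followup(practice_rating: str, impact_rating: str) -> list:
    """Follow-up steps for the two discrepancy cases described on slides 14-15."""
    low_practice = practice_rating in ("needs improvement", "unsatisfactory")   # assumed mapping
    high_practice = practice_rating in ("proficient", "exemplary")              # assumed mapping

    if low_practice and impact_rating in ("moderate", "high"):
        return ["Evaluator reviews discrepancy with the educator",
                "Evaluator's supervisor considers discrepancy trends when evaluating the evaluator"]
    if high_practice and impact_rating == "low":
        return ["Educator placed on a 1-year growth plan focused on the discrepancy",
                "Evaluator's supervisor must review the rating",
                "Superintendent has final authority over the summative rating"]
    return ["No discrepancy case applies; continue the standard 5-step cycle"]

print(discrepancy_followup("exemplary", "low"))
```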

16 Agenda Goals of the Proposed Regulations
Key Features of the Proposed Regulations
Components of the Model System
Challenge: Assessing Educator Impact on Student Growth using Multiple Measures
Challenge: Self-Assessment and Goal Setting
Challenge: Timeline for Implementation
The state law is clear that educator evaluation is bargained locally. However, DESE is delineating principles that are broader than those that were laid out in the last set of educator evaluation regulations in 1995, so that there will be more “non-negotiable” elements to the educator evaluation system.

17 Key components planned for the Model
Contract language describing:
Process
Timelines
Collection of evidence
A rubric for each standard and indicator that:
describes professional practice vividly and clearly at four levels of performance
is differentiated for different roles, e.g., classroom teacher, caseload teacher, counselor, principal
includes core and supplementary indicators
DESE is working on developing a robust model system based on the final version of the regulations adopted by the Board. While we cannot mandate that districts adopt a specific evaluation system, we are providing an incentive to do so: much of the professional development workshops and online resources will be built around these rubrics. The model will include: 1) contract language which describes the actual process (including the various paths and plans) and the timeline for each of the four educator plans, and 2) rubrics that provide the detail a district might need for teachers (both professional status and caseload), administrators (principals and others), and superintendents.


19 Other components of the Model
Templates for:
Self-assessments
Goals
Plans:
Developing Educator Plan
Self-directed Growth Plan
Directed Growth Plan
Improvement Plan

20 Other components of the Model, cont’d
Guidelines for developing and using multiple measures of student learning, growth and achievement
Guidelines for determining low, moderate and high impact on student learning
Examples and resources on:
Multiple measures of student learning
Determining educator impact
Examples of ways to collect and use feedback from:
Students
Staff (for administrators)
Parents (initially for administrators)

21 Stakeholder Feedback from Students, Staff and Parents
Focus on school-wide feedback (initially)
Students, starting in grade 6 (?)
A possibility: ESE-supported on-line data collection

22 Stakeholder Feedback Examples
New York City Learning Environment Survey: Garners annual feedback from parents, students and teachers. Results factor into the school progress report rating and help schools better understand their own strengths and target areas for improvement.
Massachusetts Teaching, Learning and Leading Survey (Mass TeLLS): Taken by 40,000 teachers and administrators in 2008. Educators provided views about teaching and learning conditions in their schools, including leadership, empowerment, facilities and resources, PD, and time.
Boston Student Advisory Council (BSAC) Student to Teacher Constructive Feedback: Students provide annual, anonymous feedback about individual teachers.

23 Supports planned for the Model
Annual updates
Orientation tools and resources for a variety of audiences
On-line and hybrid professional development on observation, goal setting, etc.
Networks of Practice
Web-based rubric “library”

24 Supports for the Model System
Outreach to state associations, e.g.:
Principals (MESPA and MSSAA)
Department Heads and Supervisors (MASCD)
Counselors (MASCA)
ESL (MATSOL)
Art (MAEA)
Training and support for regional collaboratives to develop and share expertise and resources among member districts

25 Agenda Goals of the Proposed Regulations
Key Features of the Proposed Regulations
Components of the Model System
Challenge: Assessing Educator Impact on Student Growth using Multiple Measures
Challenge: Self-Assessment and Goal Setting
Challenge: Timeline for Implementation

26 What are “multiple measures”?
MCAS growth percentile data, when applicable
MEPA growth scores, when applicable
Other assessments comparable district-wide across a grade or subject, including approved commercial assessments and district-developed pre/post unit and course assessments
Teacher-developed assessments (individual, team, and/or school)

27 MCAS Growth to grade 7: Three students
(Chart: Gina’s MCAS scaled score of 230 in grades 5, 6 and 7 is shown against the grade 7 scores of her academic peers, banded by SGP range – 1 to 19, 20 to 39, 40 to 59, 60 to 79, 80 to 99 – across performance levels from Warning/Failing to Advanced; 65% of her peers scored above her and 35% below.)
Because there’s been much talk about “using MCAS”, I thought it would be helpful to provide a “primer” on the MCAS Student Growth Percentile (SGP) that will be used as ONE (only one) of the measures for assessing educator impact on student learning. (Please note that it applies to all schools – but only 17% of teachers.) These next slides are from ESE’s orientation materials on the MCAS Student Growth Percentile (SGP). 1. Each student’s rate of change is compared to other students with a similar test score history (“academic peers”). 2. The rate of change is expressed as a percentile. How much did Gina improve in mathematics from 5th grade to 6th grade, relative to her academic peers? If Gina improved more than 35 percent of her academic peers, then her student growth percentile would be 35. 3. Gina scored 230 on the ELA MCAS in grade 5 in 2006, and 230 in grade 6 in 2007. We looked at all the students who scored similarly to her to see how they did in ELA in grade 7. Some of those students’ scores dropped substantially, to Warning/Failing. Some students with this test score history get all the way up to Advanced. The typical youngster (in the orange part of the triangle) improves a little. Note that this is real data, not projections – this is how students with this test score history actually performed in 2008. 5. So what happened to Gina? In grade 7, Gina again scores 230. However, look at the distribution of her academic peers. Most of her academic peers scored higher than she did. In fact, 65% of them scored higher than she did. She outscored 35% of her academic peers. Therefore, Gina has a student growth percentile of 35. 6. Without the growth model, it would seem as if a student scoring 230, 230, 230 is staying put... neither improving nor declining. However, we can now see from the growth model that this student improved less than 65% of her peers. 7. Note: The SGP groups on the right of the chart define the meaning of the bands in the triangle. The top part of the triangle – dark brown – corresponds to the scaled scores for students who achieved 80th to 99th percentile growth. The next band corresponds to those with 60th to 79th percentile growth, and so on. SGPs between 40 and 59 are typical.
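To make the percentile idea in these notes concrete, here is a minimal sketch in Python. It simply counts what share of a student's academic peers scored below her on the next test; the peer list is a made-up stand-in for illustration, and ESE's actual SGP methodology is more sophisticated than this simple count.

```python
def simple_sgp(student_score: int, peer_scores: list) -> int:
    """Illustration only: the percentage of academic peers (students with a
    similar test score history) whom this student outscored on the next test."""
    outscored = sum(1 for score in peer_scores if score < student_score)
    return round(100 * outscored / len(peer_scores))

# Hypothetical peer group in which 65% of Gina's academic peers scored above
# her grade 7 score of 230, matching the slide's example.
peers = [225] * 35 + [235] * 65
print(simple_sgp(230, peers))  # -> 35
```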

28 Growth to grade 7: Three students
(Chart: Harry’s grade 7 score of 244 is shown against the grade 7 scores of his academic peers; 75% of them scored above him and 25% below.)
Harry scored 248 in grade 5, and 248 in grade 6. He’s a high scorer. He’s compared with all the students that demonstrated a similar pattern of scoring. 3. In grade 7, he scores 244, which doesn’t seem like a big dip in achievement. However, look at the distribution of his academic peers. Most of his academic peers scored higher than he did. In fact, 75% of them scored higher than he did. He only outscored 25% of his academic peers. Therefore, Harry has a student growth percentile of 25. 4. SGP gives us a way to understand growth for high achieving students. High achieving students can grow at a low, moderate or high rate, just as low achieving students can.

29 Growth to grade 7: Three students
(Chart: Ivy’s grade 7 score of 226 is shown against the grade 7 scores of her academic peers; only 8% of them scored above her, and she outscored 92%.)
1. Here’s a low achieving student. Ivy scored 214 in grade 5, and 214 in grade 6. She’s compared with all the students that demonstrated a similar pattern of scoring. 2. In grade 7, she scores 226. While she’s still in the Needs Improvement performance level, look at the distribution of her academic peers. She outscored 92% of her academic peers. Only 8% of her academic peers scored higher than she did. Therefore, Ivy has a student growth percentile of 92. 214 to 226 seems like a modest improvement, but a growth percentile of 92 indicates that this student grew an unusually high amount between grades 6 and 7. Without the growth model, we would have known Ivy had made an improvement, but we would not have been able to understand how big it really was.

30 Growth to grade 7: Three students
(Chart: Gina, Harry, and Ivy plotted together, with Harry near the Advanced level, Gina in the Proficient range, and Ivy in Needs Improvement.)
1. Each student – high achieving Harry, moderate achieving Gina, and low-achieving Ivy – has the exact same opportunity to grow at the 99th percentile or the 1st percentile. Note: The trend in the range is pointing up slightly. That indicates that the median student in grade 6 ELA tends to score a bit better in grade 7 ELA, no matter whether they’re in the Warning, Needs Improvement, Proficient, or Advanced category in grade 6.

31 Growth to grade 7: Three students
English language arts – Grade 5 (2006), Grade 6 (2007), Grade 7 (2008), SGP (2008):
Gina: 230, 230, 230 – SGP 35
Harry: 248, 248, 244 – SGP 25
Ivy: 214, 214, 226 – SGP 92
1. This is a numerical representation of the graphical data we just showed. 2. The student with the lowest absolute score actually demonstrated the most growth (and vice versa). 3. Low-achieving Ivy is high-growing Ivy; high-achieving Harry is low-growing Harry.

32 Median student growth percentile
Last name – SGP: Lennon 6, McCartney 12, Starr 21, Harrison 32, Jagger 34, Richards 47, Crosby 55, Stills 61, Nash 63, Young 74, Joplin 81, Hendrix 88, Jones 95
Imagine that the students listed above are all the students in your 6th grade class. Note that they are sorted from lowest to highest SGP. The point where 50% of students have a higher SGP and 50% have a lower SGP is the median; for this class of 13 students it is the 7th value, 55. Rules of thumb: 1. Typical student growth percentiles are between about 40 and 60 on most tests. 2. A student or group outside this range has higher or lower than typical growth. 3. Differences of fewer than 10 student growth percentile (SGP) points are likely not educationally meaningful. Reminder: to calculate the median, sort the student SGPs from lowest to highest and find the middle student. Note: this “class” of students is composed of 1960s-era rockers; e.g., “Jones” is Davy Jones.
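A quick way to verify the median (and the rule-of-thumb band) is with a few lines of Python, using the SGPs from the slide:

```python
from statistics import median

sgps = {"Lennon": 6, "McCartney": 12, "Starr": 21, "Harrison": 32, "Jagger": 34,
        "Richards": 47, "Crosby": 55, "Stills": 61, "Nash": 63, "Young": 74,
        "Joplin": 81, "Hendrix": 88, "Jones": 95}

# With 13 students, the median is the 7th value once the SGPs are sorted.
class_median = median(sgps.values())
print(class_median)              # -> 55
print(40 <= class_median <= 60)  # True: within the "typical" 40-60 band
```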

33 Challenging a Level 4 School
(Chart: Median Student Growth Percentile in English Language Arts for the Murkland, Lincoln, and Sokolovsky schools.)

34 A 5-Step Evaluation Cycle

35 A Case Study: Lowell Public Schools
Here is an example of how one district is using Student Growth Percentiles to understand its strengths and weaknesses. Lowell likes to use distribution charts that show the percentage of students in each quintile (20%) of the SGP distribution. By definition, 20% (or 1/5) of all the students statewide at each grade/subject fall in each quintile: very high, high, moderate, low and very low. Lowell wanted to investigate mathematics performance, so they first looked at differences in the growth percentile distribution by grade. They were excited by what they saw at the 6th grade, especially: statewide in 2009, 40% of 6th grade students demonstrated high or very high growth; in Lowell, 55% of 6th graders had high or very high growth in 2009. They wanted to unpack this data, so they looked at the data by school.
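Lowell's quintile charts can be reproduced from raw student SGPs by simple binning. Below is a minimal sketch using the 20-point bands described above; the student list is invented purely to illustrate a cohort in which 55% of students show high or very high growth.

```python
def quintile_distribution(sgps: list) -> dict:
    """Percent of students in each SGP quintile (statewide, each band holds about 20%)."""
    bands = {"very low": (1, 19), "low": (20, 39), "moderate": (40, 59),
             "high": (60, 79), "very high": (80, 99)}
    return {name: round(100 * sum(lo <= s <= hi for s in sgps) / len(sgps), 1)
            for name, (lo, hi) in bands.items()}

# Hypothetical 6th grade cohort of 20 students.
cohort = [15, 22, 28, 35, 45, 50, 55, 58, 59, 62, 65, 70, 72, 75, 81, 85, 88, 90, 92, 95]
dist = quintile_distribution(cohort)
print(dist)
print(dist["high"] + dist["very high"])  # -> 55.0 (% with high or very high growth)
```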

36 They found tremendous variation among schools in Grade 6 performance: at the Pyne, only 6% of all students had low or very low growth; at the Bartlett, only 24% of 6th graders grew at a high or very high rate. What especially caught their eye, though, was performance at the Stoklosa Middle School. While Pyne, Wang and Daley serve a heterogeneous mix of students, Stoklosa serves the poorest section of the city. Over 90% of its students are poor; 47% are English Language Learners. Yet 44% of its 6th graders had grown at a very high rate and another 27% at a high rate – a total of 71% of students with SGPs over 60.

37 They wanted to know if this was a fluke. It wasn’t
They wanted to know if this was a fluke. It wasn’t. The year before, 62% of Stoklosa’s 6th graders had grown at a high or very high rate. Note, too, that the Bartlett’s 6th graders the year before had grown at a faster rate than the state: 55% had grown at a high or very high rate.

38 Lowell officials wanted to dig deeper, so they disaggregated the data for Stoklosa’s 6th graders. They discovered that their ELL students’ MCAS growth was high. They dug deeper.

39 Disaggregating results for regular education and ELL students by grade led to new discoveries:
Very high growth for both regular education and ELL students was 2 to 3 times higher than the state at all grade levels except 7th grade.
High and very high growth was exceptionally strong for both 5th and 6th grade students, and considerably lower for 7th grade students.
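This kind of disaggregation is a straightforward group-by over student-level records. Here is a sketch assuming a hypothetical flat table of students with grade, program, and SGP columns (the column names and values are made up for illustration; a district would pull the real data from its own systems).

```python
import pandas as pd

# Hypothetical student-level records.
df = pd.DataFrame({
    "grade":   [5, 5, 5, 6, 6, 6, 6, 7, 7, 7],
    "program": ["ELL", "RegEd", "RegEd", "ELL", "ELL", "RegEd", "RegEd", "ELL", "RegEd", "RegEd"],
    "sgp":     [85, 62, 58, 91, 88, 74, 83, 41, 35, 29],
})

# Percent of students with high or very high growth (SGP of 60 or above), by grade and program.
high_growth = (df.assign(high=df["sgp"] >= 60)
                 .groupby(["grade", "program"])["high"]
                 .mean() * 100)
print(high_growth.round(1))
```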

40 Other measures that can help Stoklosa staff assess impact on student learning and growth
Massachusetts English Proficiency Assessment (MEPA)
District math benchmark assessment
District-approved commercial assessment tied to district curriculum
District-adopted curriculum-embedded performance assessment
MCAS Growth Percentile scores alone can’t tell the full story. Stoklosa staff need to turn to other measures of student learning to help them understand what’s going on and what they might do differently to get different results. They have options: the state assessment of English language proficiency, which shows growth year-to-year for English Language Learners (MEPA); results from the district’s own interim benchmark assessments for math; and perhaps one or more commercial assessments tied to the curriculum that Lowell uses. And Lowell is probably working now on some district-wide curriculum-embedded performance assessments for math, or will wait for the models ESE plans to develop as part of the Teaching and Learning System being built with Race to the Top funds.

41 Agenda Goals of the Proposed Regulations
Key Features of the Proposed Regulations
Components of the Model System
Challenge: Assessing Educator Impact on Student Growth using Multiple Measures
Challenge: Self-Assessment and Goal Setting
Challenge: Timeline for Implementation

42 A 5-Step Evaluation Cycle

43 Self-assessment and Goal Setting
Based on:
Standards & indicators (rubric) + district & school priorities
An analysis of multiple measures of learning and growth of our students in the past
An analysis of the students we have now
At least:
One goal for professional practice
One goal for student learning, growth and achievement
Attributes of a useful goal: Specific, Measurable, Attainable, Relevant, Time-bound
Self-assessment and goal setting are two ways the draft regulations aim to encourage active engagement of educators in their own evaluation. We are hopeful, too, that many districts will seize on the opportunity to strengthen their professional learning communities by encouraging self-assessment and goal setting at the team, grade, department and school level – especially for proficient and exemplary educators.

44 SMART goals of Stoklosa’s 7th grade team
Professional Practice goal: I/We will…
Student Learning goal: My/Our students will…

45 SMART goals of a middle school music teacher
Professional Practice goal: I will collaborate with my colleagues in the music department to develop, pilot, analyze, revise and share 2 performance-based assessments.
Student Learning goal: My students will be able to identify and apply music terms, symbols and definitions in the curriculum guide for 6th, 7th and 8th grade. Using a department-developed assessment, 75% of my students will score 85% or above on the third quarter assessment.

46 SMART Goals of an 8th grade social studies teacher
Professional practice goal: To strengthen expository writing, I will study the “workshop process” for writing, observe it in practice, and introduce it in at least two of my classes by the start of second term.
Student learning goal: At the end of the third quarter unit on the Constitution, students will demonstrate proficiency by writing a pamphlet for new citizens about their constitutional rights. Using a department-developed rubric, a majority of my students will have moved one level on the writing component of the rubric since the start of the second term.

47 SMART Goals of a Middle School Principal
Professional practice goal: I will complete 100% of goal-setting conferences with my fifth and sixth grade teams by October 15th, seek anonymous feedback about staff perceptions of their usefulness in improving their practice, research effective goal setting with my colleagues, and identify steps I will take in mid-year formative assessment conferences to improve the likelihood that their practice and student growth goals will be achieved.
Student learning goal: The proportion of fifth and sixth grade students with high or very high SGP growth will increase by 5 percentage points in both ELA and Math.

48 SMART Goals of a 10th grade geometry teacher
Professional practice goal: To engage students more, starting second term, we will incorporate at least one real-world application of geometry into 2 of every 5 homework assignments.
Student learning goal: 85% of our students will score 80% or above on the district-developed third quarter exam.

49 Agenda Goals of the Proposed Regulations
Key Features of the Proposed Regulations
Components of the Model System
Challenge: Assessing Educator Impact on Student Growth using Multiple Measures
Challenge: Self-Assessment and Goal Setting
Challenge: Timeline for Implementation

50 Anticipated timeline for Implementation
Level 4 schools + volunteer “early adopters”
For all Race to the Top districts
For all other districts

51 Educator Impact on Student Learning
603 CMR (4) By September 2013, each district shall adopt a district-wide set of student performance measures for each grade and subject that permits a comparison of student learning gains. MCAS Student Growth Percentile shall be employed where it is available. At least two measures of student learning gains shall be employed at each grade and subject in determining impact on student learning.

52 Priorities for ESE action and support?
Model contract language and rubrics for teachers and principals
Strategies for “making time” to do evaluation well
Orientation materials for many audiences
Strategies for using the rubric to develop a shared, specific picture of practice at four levels of proficiency
Access to low-cost PD for evaluators to use the rubric effectively
Guidelines and examples: Self-Assessment and Goals
Guidelines and examples: Educator Plans
Developing District-wide Measures of Student Learning*
Determining Educator Impact on Student Learning*
Student, Staff and Parent Feedback*
* Will take more than one year to develop, pilot and validate

53 If I were in your shoes…. This summer, I’d:
Begin (or deepen) work on building educators’ capacity to analyze data about student learning and set SMART goals.
Work with my principals on their SMART goals and establish clear expectations for what I want to see when I make my first visits to their schools this fall and observe practice with them.
Begin to line up potential partners and supports; for example, I’d ask my collaborative if it will work with my district and other member districts on implementation.
Introduce the first draft of ESE’s model rubric (available mid-July, hopefully) to see how well it might match my district’s needs.
Attend the M.A.S.S. Summer Institute to get the latest information from ESE on the status of the regulations and the model, and to confer with colleagues.

54 If I were in your shoes… (cont’d)
I would not start bargaining now. There isn’t enough information to go on yet. I would, however, let my committee and union know that we will have to open the contract to bargain this.

55 Questions? Suggestions? Priorities? Please complete the feedback sheet
Karla Brooks Baehr, Deputy Commissioner

