Educator Evaluation System


1 Educator Evaluation System
NEC and SEEM Workshop May 4, 2012

2 Presenters
Donna Martinson, Teacher, Parker Middle School
Elisabeth Shanley, Teacher, Parker Middle School
Joanne Fitzpatrick, Reading Memorial High School
Helen Sellers, Killam Elementary School
John Doherty, Superintendent of Schools

3 Agenda
9:00 AM - 10:00 AM: An overview of the process for Board members and union representatives
10:00 AM - 10:15 AM: Break
10:15 AM - 12:00 noon: Guidance around the evaluation process and SMART Goals for administrators
Feel free to ask questions throughout the presentation

4 RPS Educator Evaluation Wiki
Wiki with Resources

5 Let’s Take a Few Minutes
Take a few minutes to write down any burning questions you may have about the evaluation process, viewed through the lens of the collective bargaining process

6 Burning Questions

7 Burning Questions

8 Burning Questions

9 Agenda
Discussion of Educator Evaluation Regulations
Comparison to our TAP: what is the same, what is new
How does this affect me as a teacher?
Next steps in the process
Questions

10 The Power of Teamwork and Collaboration

11 Every Beginning is Difficult

12 Educator Evaluation Model System
Massachusetts Department of Elementary and Secondary Education

13 Educator Evaluation
New DESE Regulations approved on June 28, 2011
Collaboratively designed by:
Massachusetts Teachers Association
Massachusetts Association of Secondary School Principals
Massachusetts Elementary School Principals Association
Massachusetts Association of School Superintendents
Department of Elementary and Secondary Education
Requires evaluation of all educators on a license
Designed to promote leaders’ and teachers’ growth and development
Designed to support and inspire excellent practice

14 Reading is an Early Adopter
Our current system is comparable to the new DESE model
Allowed us to give significant input into the process
Developed a network with other school districts
Attended professional development opportunities
Piloted: Educator Plan with SMART Goals; Superintendent’s Evaluation Process; Principal Evaluation Process

15 TAP Committee: A Key to the Process
Committee of teachers, building administrators, and central office administrators
Representation from every school
Compared current rubric with the model rubric system
Reviewed model contract language
Will be involved in development of forms for September 2012

16 Components of System: Focuses on Educator Growth, not “Gotcha”
Educators are partners in the process
Five-Step Evaluation Cycle
  Self-Assessment
  Analysis, Goal Setting, Educator Plan Development
  Implementation of Plan
  Formative Assessment (Midyear or Mid-cycle)
  Summative Evaluation (End of Year/Cycle Evaluation)
Rubric for Evaluation
Use of Artifacts for Evidence: lesson plans, professional development activities, fliers
Walkthroughs
Announced and unannounced observations
Differentiated Approach
  New teachers
  Non-PTS teachers
  PTS teachers
  PTS teachers who need additional support
Use of SMART Goals

17 Components of System: Levels of Performance on Rubric
Exemplary (Exceeding the Standard)
Proficient (Meeting the Standard)
Needs Improvement (Progressing Toward the Standard)
Unsatisfactory (Does Not Meet the Standard)
Specificity of Rubric: Standards, Indicators, Elements
Four Standards
Multiple Measures of Student Performance ( School Year)
Use of student surveys ( School Year)

18 5-Step Evaluation Cycle: Continuous Learning
Every educator is an active participant in an evaluation
The process promotes collaboration and continuous learning
Foundation for the Model
This graphic represents the framework in the regulations by depicting the 5-step process of continuous learning. Two additional key points to mention: First, the framework puts responsibility on both the educator and the evaluator to complete the evaluation cycle. Second, the length of the cycle is determined by an educator’s plan, which varies by current career stage and previous year’s performance. For example, teachers and administrators with PTS will typically be on a two-year cycle; all other educators will typically be on a one-year cycle. (next slide)
Massachusetts Department of Elementary and Secondary Education

19 5-Step Evaluation Cycle: Rubrics
Part III: Guide to Rubrics, Pages 4-5
The rubric is used to assess performance and/or progress toward goals
The rubric is used to analyze performance and determine ratings on each Standard and overall
Every educator uses a rubric to self-assess against the Performance Standards
Professional practice goals (team and/or individual) must be tied to one or more Performance Standards
Evidence is collected for Standards and Indicators; the rubric should be used to provide feedback
Now that we’ve reviewed the basic structure of the rubrics, let’s look at how they’re used during the 5-Step Evaluation Cycle. As you can see, rubrics are used at each stage.
#1: Self-Assessment: The rubric is a tool to guide self-assessment against the four performance standards.
#2: Analysis, Goal Setting & Plan Development: Educators and evaluators use the rubric to help select specific goals that are aligned to the standards, indicators, and elements, and build an educator plan around the attainment of these goals.
#3: Implementation of the Plan: The rubric is a critical tool for tracking progress and collecting evidence of practice, while also serving as a guide for the evaluator to provide feedback to the educator.
#4: Formative Assessment/Evaluation: At the formative assessment/evaluation stage, the rubric is the primary tool evaluators use to assess progress toward goals and to provide targeted feedback to the educator.
#5: Summative Evaluation: The rubric provides the goal posts: what performance level has the educator achieved on each Standard based on the evidence of their practice, and what is their overall rating?
Massachusetts Department of Elementary and Secondary Education

20 Four Different Educator Plans
The Developing Educator Plan (non-PTS teachers and teachers new to a position) is developed by the educator and the evaluator and is for one school year or less.
The Self-Directed Growth Plan (PTS teachers) applies to educators rated Proficient or Exemplary and is developed by the educator. When the Rating of Impact on Student Learning is implemented (beginning in ), educators with a Moderate or High Rating of Impact will be on a two-year plan; educators with a Low Rating will be on a one-year plan.
The Directed Growth Plan (PTS teachers) applies to educators rated Needs Improvement and is a plan of one school year or less developed by the educator and the evaluator.
The Improvement Plan (PTS teachers) applies to educators rated Unsatisfactory and is a plan of no less than 30 calendar days and no longer than one school year, developed by the evaluator.

21 Goal Setting Process: Focus, Coherence, Synergy
Diagram: District Strategy and Superintendent Goals (School Committee); School Improvement Plans and Principal Goals; Classroom Practice and Teacher Goals; all aligned toward Student Achievement

22 Standards, Indicators and Rubrics
Standards (4) - Required in Regulations
  Instructional Leadership (5 Indicators)
  Management and Operations (5 Indicators)
  Family and Community Engagement (4 Indicators)
  Professional Culture (6 Indicators)
Indicators (20) - Required in Regulations
Elements (32) - May be modified, but must keep rigor
Rubrics - A tool for making explicit and specific the behaviors and actions present at each level of performance.

23 Four Standards of Practice
The framework establishes four standards of practice, with supporting rubrics defining four levels of effectiveness.
Principals & Administrators: Instructional Leadership*, Management and Operations, Family & Community Partnerships, Professional Culture
Teachers: Curriculum, Planning & Assessment*, Teaching All Students*, Family & Community Engagement, Professional Culture
* denotes a standard on which the educator must earn a proficient rating to earn an overall proficient or exemplary rating; earning professional teaching status without proficient ratings on all four standards requires superintendent review
Revised 9/30/2011
Massachusetts Department of Elementary and Secondary Education

24 Model Rubrics: Structure
Part III: Guide to Rubrics, Page 6
What do we mean by “vertical alignment”? Let’s start with the basic structure of the rubrics.
HANDOUT: Teacher Rubric At-a-Glance / School-level Administrator Rubric At-a-Glance
This handout gives you the basic outline of the teacher rubric and the school-level administrator rubric. Let’s take a look at the teacher side. Each rubric starts with the four standards listed across the top row. Under each standard are several indicators, listed in bold, that break the standard down into different components. And attached to each indicator are several numbered elements that describe the specific behaviors associated with those indicators: specifically, “what does it look like?” The extent to which each standard, its indicators, and the elements are aligned with one another is “vertical alignment.” Let’s look at an example.
Massachusetts Department of Elementary and Secondary Education

25 Model Rubrics: Vertical Alignment within Rubrics
Example: Teacher Rubric
Standard I: “Curriculum, Planning, and Assessment”
Indicator I-B: “Assessment”
Elements 1 & 2: I-B-1, Variety of Assessment Methods; I-B-2, Adjustments to Practice
HANDOUT: Standards and Indicators of Effective Administrative Leadership Practice / Standards and Indicators of Effective Teaching Practice
These are excerpts from two of the model rubrics: one from the teacher rubric and one from the administrator rubric. Let’s look at the teacher rubric for the moment. This particular excerpt is from Standard I: Curriculum, Planning & Assessment, and Indicator I-B: Assessment. There are two elements attached to this indicator, listed in the far left-hand column. Take a moment and read the definition of the indicator in bold at the top of the table, and then read the titles of the two elements. Using your pen, underline the components of the definition that are reflected in the titles of the two elements. The extent to which the elements give meaning to the indicator reflects the vertical alignment built into the rubric.
Part III: Guide to Rubrics, Appendix C, pages 2-4
Massachusetts Department of Elementary and Secondary Education

26 Model Rubrics: Structure
The next component of the model rubrics that lends rigor and comprehensiveness to them is what we call horizontal alignment. As you can see in the rubric excerpt, each element is broken down into four descriptors that correspond with the four performance levels, ranging from Unsatisfactory to Needs Improvement to Proficient to Exemplary. Think of the descriptors for each element as helping you see “what it looks like.”
Part III: Guide to Rubrics, Page 6
Massachusetts Department of Elementary and Secondary Education

27 The Model Rubrics are Aligned
We have four model rubrics right now: Superintendent (district-level administrator), Principal (school-level administrator), Teacher, and Specialized Instructional Support Personnel (educators who don’t have classes of their own but who consult with teachers and also work directly with children or young people, such as nurses, guidance counselors, and library/media specialists). Each rubric is very deliberately connected to the others. The next slide offers an example related to goal setting.
Massachusetts Department of Elementary and Secondary Education

28 Rubric Alignment, e.g., Goal Setting
Superintendent Rubric (I-D-1): Supports administrators and administrator teams to develop and attain meaningful, actionable, and measurable professional practice, student learning, and, where appropriate, district/school improvement goals.
Principal/School-level Administrator Rubric (I-D-1): Supports educators and educator teams to develop and attain meaningful, actionable, and measurable professional practice and student learning goals.
Teacher Rubric (IV-A-2): Proposes challenging, measurable professional practice, team, and student learning goals that are based on thorough self-assessment and analysis of student learning data.
First, take a look at the descriptor for proficient performance in the school-level administrator rubric for Indicator D, Element #1 under Standard I: Instructional Leadership. Next is the descriptor for proficient performance in the district-level (superintendent) administrator rubric for the same element: I-D-1, Educator Goals. And finally, here is the descriptor for proficient performance in the teacher rubric. Note that it falls under a different standard and indicator; you can find it with the other responsibilities under Standard IV, Professional Culture, on the Teacher Rubric At-a-Glance on the other side of your handout. Do you see the relationship among the three rubrics for goal setting? Please circle the words that differentiate the responsibilities each has in goal setting.

29 Alignment of Rubrics, e.g., Goal Setting
Here are the words I circled. Are they the same as yours?

30 Exemplary “The educator’s performance significantly exceeds Proficient and could serve as a model for leaders district-wide or even statewide. Few educators—principals and superintendents included—are expected to demonstrate Exemplary performance on more than a small number of Indicators or Standards.” Part III: Guide to Rubrics Page 14 Massachusetts Department of Elementary and Secondary Education

31 Proficient “Proficient is the expected, rigorous level of performance for educators. It is the demanding but attainable level of performance for most educators.” Part III: Guide to Rubrics Page 9 Massachusetts Department of Elementary and Secondary Education

32 Needs Improvement Educators whose performance on a Standard is rated as Needs Improvement may demonstrate inconsistencies in practice or weaknesses in a few key areas. They may not yet fully integrate and/or apply their knowledge and skills in an effective way. They may be new to the field or to this assignment and are developing their craft.

33 Unsatisfactory Educators whose performance on a Standard is rated as Unsatisfactory are significantly underperforming as compared to the expectations. Unsatisfactory performance requires urgent attention.

34 Teacher Rubric: Standards and Indicators
Standard I: Curriculum, Planning, and Assessment
  A. Curriculum and Planning Indicator: 1. Subject Matter Knowledge; 2. Child and Adolescent Development; 3. Rigorous Standards-Based Unit Design; 4. Well-Structured Lessons
  B. Assessment Indicator: 1. Variety of Assessment Methods; 2. Adjustments to Practice
  C. Analysis Indicator: 1. Analysis and Conclusions; 2. Sharing Conclusions With Colleagues; 3. Sharing Conclusions With Students
Standard II: Teaching All Students
  A. Instruction Indicator: 1. Quality of Effort and Work; 2. Student Engagement; 3. Meeting Diverse Needs
  B. Learning Environment Indicator: 1. Safe Learning Environment; 2. Collaborative Learning Environment; 3. Student Motivation
  C. Cultural Proficiency Indicator: 1. Respects Differences; 2. Maintains Respectful Environment
  D. Expectations Indicator: 1. Clear Expectations; 2. High Expectations; 3. Access to Knowledge
Standard III: Family and Community Engagement
  A. Engagement Indicator: 1. Parent/Family Engagement
  B. Collaboration Indicator: 1. Learning Expectations; 2. Curriculum Support
  C. Communication Indicator: 1. Two-Way Communication; 2. Culturally Proficient Communication
Standard IV: Professional Culture
  A. Reflection Indicator: 1. Reflective Practice; 2. Goal Setting
  B. Professional Growth Indicator: 1. Professional Learning and Growth
  C. Collaboration Indicator: 1. Professional Collaboration
  D. Decision-Making Indicator: 1. Decision-Making
  E. Shared Responsibility Indicator: 1. Shared Responsibility
  F. Professional Responsibilities Indicator: 1. Judgment; 2. Reliability and Responsibility

35 Example of Teacher Rubric
Standard I: Curriculum, Planning, and Assessment. The teacher promotes the learning and growth of all students by providing high-quality and coherent instruction, designing and administering authentic and meaningful student assessments, analyzing student performance and growth data, using this data to improve instruction, providing students with constructive feedback on an ongoing basis, and continuously refining learning objectives.

36 Example Indicator I-A. Curriculum and Planning: Knows the subject matter well, has a good grasp of child development and how students learn, and designs effective and rigorous standards-based units of instruction consisting of well-structured lessons with measurable outcomes.

37 Example Element A-1. Subject Matter Knowledge
Proficient-Demonstrates sound knowledge and understanding of the subject matter and the pedagogy it requires by consistently engaging students in learning experiences that enable them to acquire complex knowledge and skills in the subject.
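To make the structure from the last few slides concrete, here is a minimal sketch, in Python, of how the rubric hierarchy could be represented as nested data: a Standard contains Indicators, each Indicator contains Elements, and each Element carries one descriptor per performance level. The layout and field names are illustrative only (not a DESE format); only the Proficient descriptor quoted on the slide above is filled in.

```python
# Illustrative sketch of the rubric hierarchy (Standard -> Indicator -> Element),
# not a DESE data format. Only Element I-A-1's Proficient descriptor is quoted
# from the slide; the other levels are left as placeholders.

teacher_rubric_excerpt = {
    "Standard I": {
        "title": "Curriculum, Planning, and Assessment",
        "indicators": {
            "I-A": {
                "title": "Curriculum and Planning",
                "elements": {
                    "I-A-1": {
                        "title": "Subject Matter Knowledge",
                        "levels": {
                            "Unsatisfactory": "...",
                            "Needs Improvement": "...",
                            "Proficient": (
                                "Demonstrates sound knowledge and understanding of the "
                                "subject matter and the pedagogy it requires by consistently "
                                "engaging students in learning experiences that enable them "
                                "to acquire complex knowledge and skills in the subject."
                            ),
                            "Exemplary": "...",
                        },
                    },
                },
            },
        },
    },
}

# Vertical alignment: walking down the keys keeps each element tied to its
# indicator and standard. Horizontal alignment: each element maps to four
# performance-level descriptors.
element = teacher_rubric_excerpt["Standard I"]["indicators"]["I-A"]["elements"]["I-A-1"]
print(element["title"], "->", element["levels"]["Proficient"][:40], "...")
```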

38 Multiple sources of evidence inform the summative performance rating
This graphic explains the summative performance rating in the new educator evaluation framework. The left column includes the three categories of evidence used in evaluation: Products of Practice (including observations, unit plans, schedules, and the like), Multiple Measures of Student Learning (ranging from classroom assessments to MCAS Growth Percentile scores and MEPA when available), and Other Evidence, including, eventually, student feedback. (A note here: MEPA, the Massachusetts English Proficiency Assessment for English Language Learners, is being replaced by a better assessment being developed by a national consortium of states called WIDA, pronounced “WEE-DUH.”) The middle column represents the translation of the evidence into an assessment of performance on each of four standards, in addition to an assessment of progress on goals. The right column is the single summative performance rating.
Starting from the left, using a rubric, an evaluator lines up evidence from the three categories to determine a rating on each of the four Standards (for teachers: Curriculum, Planning and Assessment; Teaching All Students; Family and Community Engagement; and Professional Culture) and an assessment of progress on both student learning and professional practice goals. One important note: educators have to receive a rating of Proficient or Exemplary on the Standards having to do with curriculum and instruction (teachers) or, for administrators, instructional leadership, in order to be eligible to receive an overall summative performance rating of Proficient or Exemplary.
Massachusetts Department of Elementary and Secondary Education

39 Multiple sources of evidence inform the evaluation
Evidence: Products of Practice (e.g., observations); Multiple Measures of Student Learning; Other Evidence (e.g., student surveys)
Using the rubric, this evidence is translated into ratings on Standards 1-4 and attainment of the Educator Practice Goal(s) and Student Learning Goal(s) identified in the Educator Plan
Summative Performance Rating: Exemplary, Proficient, Needs Improvement, or Unsatisfactory
Outcomes for the educator: recognition and rewards; type and duration of Educator Plan
Rating of Impact on Student Learning (Low, Moderate, or High): based on trends and patterns in at least two measures of student learning gains (MCAS growth and MEPA gains where available); measures must be comparable across schools, grades, and subject matter district-wide
Massachusetts Department of Elementary and Secondary Education, Revised 9/30/2011

40 Educators earn two separate ratings
Summative Rating (vertical axis): Exemplary, Proficient, Needs Improvement, Unsatisfactory
Rating of Impact on Student Learning (horizontal axis): Low, Moderate, or High (multiple measures of performance, including MCAS Student Growth Percentile and MEPA where available)
Exemplary or Proficient with Moderate or High impact: 2-Year Self-Directed Growth Plan
Exemplary or Proficient with Low impact: 1-Year Self-Directed Growth Plan
Needs Improvement: Directed Growth Plan
Unsatisfactory: Improvement Plan
(District Decision Matrix)
Massachusetts Department of Elementary and Secondary Education

41 Educators earn two separate ratings
(The matrix from the previous slide is repeated here.)
At Step 5 of the Cycle, I earn one of four Summative Performance Ratings, shown in the left vertical column: Exemplary, Proficient, Needs Improvement, or Unsatisfactory. The rating is based on my ratings on each of the 4 standards and the progress I have made on my goals. Remember, my Educator Plan has had at least one professional practice goal and one student learning goal for me to pursue.
That rating is not the only rating I will get. Look at the row along the bottom. Eventually, all educators will earn a second, separate rating: a rating of Low, Moderate, or High for their impact on student learning, based on trends and patterns on state- and district-determined measures of student learning gains. This is Phase 2 of the Educator Evaluation system. These impact ratings will not be made in the school year. The impact ratings depend on identifying district measures of student learning for different subjects and grades and then getting results from those assessments for at least two years. Districts will identify measures during the school year, based on guidance ESE will provide in June, and will then start to administer the assessments. For most educators, two years of collecting data means that the first ratings of Educator Impact on Student Learning gains will not be made until the end of the school year.
Together, the intersection of the two ratings will determine the duration and type of plan educators have. (A note: all non-professional-teacher-status teachers and administrators in their first three years have one-year Developing Educator Plans, regardless of what rating of impact they receive.)
Phase 1 is where goals come in: they are a focus of the 5-Step Cycle that is the backbone of Phase 1.
Summative Rating: based on the rating of performance on each of 4 Standards plus attainment of goals. Rating of Impact: based on trends and patterns on state- and district-determined measures of student learning gains.
Massachusetts Department of Elementary and Secondary Education
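To illustrate how the two ratings combine, here is a minimal sketch in Python of the plan-determination logic described on this slide and the previous one (and consistent with the plan descriptions on slide 20). The function name and strings are hypothetical, not DESE terminology, and it assumes the pairings stated above: Proficient or Exemplary educators receive a Self-Directed Growth Plan whose length depends on the impact rating, while Needs Improvement and Unsatisfactory ratings determine the plan type on their own. Developing Educator Plans for non-PTS educators sit outside this matrix.

```python
# Illustrative sketch only: maps the two ratings from the district decision
# matrix described on slides 40-41 to a plan type. Names and return strings
# are hypothetical, not DESE terminology.

def educator_plan(summative_rating: str, impact_rating: str) -> str:
    """Return the plan type implied by the intersection of the two ratings."""
    if summative_rating in ("Exemplary", "Proficient"):
        # Length of the Self-Directed Growth Plan depends on the impact rating.
        if impact_rating in ("Moderate", "High"):
            return "2-Year Self-Directed Growth Plan"
        return "1-Year Self-Directed Growth Plan"
    if summative_rating == "Needs Improvement":
        return "Directed Growth Plan"   # one school year or less (slide 20)
    if summative_rating == "Unsatisfactory":
        return "Improvement Plan"       # 30 calendar days to one school year (slide 20)
    raise ValueError(f"Unknown summative rating: {summative_rating}")

# Example: a Proficient educator with a High impact rating
print(educator_plan("Proficient", "High"))  # -> 2-Year Self-Directed Growth Plan
```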

42 Phase-in Over Next 2 Years
Phase 1: Summative ratings based on attainment of goals and performance against the four Standards defined in the educator evaluation requirements (September 2012)
Phase 2: Rating of educator impact on student learning gains based on trends and patterns of multiple measures of student learning gains (September 2013)
Phase 3: Use of feedback from students (for teachers) and teachers (for administrators) (September 2014)

43 Three Different Initiatives? Or Just One?
Diagram: Educator Evaluation, Common Core, and Common Assessments all drive Student and Teacher Growth

44 Example of Three Initiatives in One
The Common Core for literacy has three expectations:
Building knowledge through content-rich non-fiction and informational texts
Reading and writing grounded in evidence from text
Regular practice with complex text and its academic vocabulary
Goal setting would focus on:
Increasing the amount of non-fiction and informational text used in the classroom
Increasing the amount of writing that focuses on using evidence from text
Increasing student engagement by using quality questioning techniques

45 Examples of Three Initiatives in One, Continued
Classroom observations could focus on:
Engaging students directly with high-quality texts
The quality of questions and instructional strategies used to engage students with a high level of key academic vocabulary
Assessing student work through evidence of speaking and writing
Common assessments could focus on:
MCAS/PARCC
Student analytic writing that shows growth over time
Student presentations that show evidence of drawing information from texts over time

46 Individual Reflection and Discussion
What will implementation of the educator evaluation regulations allow you to do that is really important for driving instructional improvement and student learning in your system?
(5 minutes for individual writing; 10 minutes for table discussion)

47 Reflection Questions
How does educator evaluation relate to your strategy?
To what extent do different people in the organization (principals, teachers, school board members, community members) understand the relationship of educator evaluation to your strategy and to realizing what you think is most important to drive instructional improvement and student learning? What is your evidence for your assessment?
What are a couple of things you can do in the near term to help everyone in the system think about evaluation relative to larger goals for that work and system strategy?

48 How to Engage Educators
NEC and SEEM Presentation

49 Engaging Educators Framework (Source: Reform Support Network)
Four Domains of Educator Engagement: I know, I apply, I participate, I lead
Each domain expects different levels of mastery and involvement and different habits of mind. We must intentionally engage educators across all four domains.

50 A Framework for Engaging Educators
I Know I Apply I Participate I Lead

51 I Know I know how the evaluation system in my district works. I also know the rationale for the changes in policy. I understand the observational framework used to assess my performance and I understand how it intersects with student growth measures. I understand the rating system and how my rating information leads to different types of educator plans. I know to whom I can turn for support in order to improve. In short, the evaluation system is a set of clear signals I use to guide the improvement of my performance.

52 Strategies for “I Know”
All stakeholders (SEA, LEA, union) are responsible
Develop feedback loops for misconceptions: surveys, focus group sessions
Communicate, communicate, communicate: guidebooks, FAQ, website, newsletter, information sessions, podcasts/webinars, train-the-trainer models

53 I Apply I apply what I know about the evaluation system to improve my practice and get better results with the students I teach. I think through the expectations of the observation rubrics and apply those expectations to the design of my lesson plans. I also use the information from other measures of student growth to set expectations for my students and to decide how to differentiate instruction. I use feedback from observers and consider my strengths and weaknesses as a practitioner. I use student data and other forms of feedback to assess my own performance and consider what to do to continue improving the results I get with my students.

54 Strategies to Support “I Apply”
Make resources and tools available for educators to use:
Model lesson plans aligned to standards
Instructional coaching
Mentoring
Professional development
Interim assessments
Videos of high-quality instruction

55 I Participate I participate in the development, implementation, and refinement of my district’s teacher evaluation system at both the practical and policy levels. At my school, I work with leaders and colleagues to set shared expectations for how evaluations will be conducted. I collaborate with others to review the observation rubric so we can understand what it means for us. I work with my colleagues to interpret student data to inform instructional decisions. As a member of my union, I participate in union-management collaborative sessions to calibrate video teaching samples using the observation rubric. I work with union and district leadership to reflect on how the new system will change the way my colleagues and I use our time in my school.

56 Supporting “I Participate”
Feedback loops: surveys that gauge frequency and quality of feedback; focus group sessions; follow-up on feedback
Joint union/administration communication teams: break down barriers and eliminate misconceptions
Identify teachers for additional roles and responsibilities: peer observation pilot; developing assessments for multiple measures; tools and guidance with student learning objectives

57 I lead I lead my colleagues to improve their performance and to improve the evaluation system as we go forward. I am recognized as an excellent practitioner, whose classroom performance and student growth results stand out. At my school, my principal and colleagues seek me out for my expertise. I open my classroom as a demonstration site, and I am called upon to deliver model lessons. I mentor new teachers and support other teachers as they develop. At the district level, I collaborate with leaders from other schools, the union and district administration to improve the faculty’s understanding of how to improve the evaluation system. With other leaders, I visit schools around my district and help others know, apply, participate, and lead. I make sure that things are done with teachers, not to them.

58 Supporting “I Lead”
Identify excellent practitioners and give them opportunities to lead:
Study groups that focus on particular evaluation standards or development of assessments
Participation on school/district evaluation advisory committees
Establish a culture that accommodates disagreement but does not accept the status quo

59 SMART Goal Development
How many of you are familiar with SMART goals and use them in some way? What you know is not “wrong.” But we’re asking you to give it up and consider adopting the MA SMARTer Goal Model, a particular way of developing and using SMART goals.
Why? The problem with SMART goals is that since the 1981 business journal article that introduced the concept, the term “SMART goal” has been used to describe lots of different approaches to goal setting. At this point, everyone has a different picture of what they are and how to write them.
The advantage of having a shared, statewide approach to developing SMART goals is that we can develop exemplars for Massachusetts schools that can serve as starting points; we can share them across schools and districts; we can help each other write better and better SMART goals; and we can agree on a format that fits perfectly into the Educator Evaluation framework, where: the educator proposes goals; the team considers adopting them and tries to “tweak” them to fit the team, not just a single individual; the supervisor tries to make them “smarter” still; the educator gathers and presents evidence about progress toward the goals; and the evaluator determines an overall summative rating based, in part, on attainment of the goals.
Having a shared language for SMART goals will help us work smarter. So for the rest of this session, take what you know about SMART goals out of your head to make room for this particular approach to them. See if it works for you.
NEC and SEEM Workshop
Massachusetts Department of Elementary and Secondary Education

60 What Makes a Goal “SMART”?
You will need Handout 5: What Makes a Goal “SMARTer?”
Read the two pages on your own (about 5 minutes). By the end, underline one sentence, one phrase, and one word that you think are particularly significant. (Make notes along the way.)
Massachusetts Department of Elementary and Secondary Education

61 What Makes a Goal “SMART”?
In groups of 6-8 people:
Round #1: share the sentence; mark them.
Round #2: share the phrase; mark them.
Round #3: share the word; mark them.
Discuss why each of you chose the phrase you chose and any new insights you gained from hearing your colleagues’ reasons for choosing the phrase they chose. Identify one phrase to share with the larger group.
Identify the person at your table wearing the most blue. That person will be the reporter for this activity.
Massachusetts Department of Elementary and Secondary Education

62 A Massachusetts “SMARTer Goal”
Goal Statement + Key Actions + Benchmarks (Process & Outcome) = The Heart of the Educator Plan
In our model of the Massachusetts SMARTer Goal, there are three components. Taken together, the goal statement, key actions, and benchmarks are the heart of the Educator Plan.
Massachusetts Department of Elementary and Secondary Education
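As a way of seeing the three components as one unit, here is a minimal sketch in Python of a SMARTer goal record mirroring the equation on this slide: a goal statement plus key actions plus process and outcome benchmarks. The class and field names are illustrative, not part of the MA model or any DESE form, and the example entries are invented for illustration (loosely based on the team goal shown on a later slide), not taken from the handouts.

```python
# Illustrative sketch only: a SMARTer goal as goal statement + key actions +
# benchmarks (process & outcome), mirroring the slide's equation. Field names
# are hypothetical, not a DESE form; example entries are invented.
from dataclasses import dataclass, field
from typing import List

@dataclass
class SmarterGoal:
    goal_statement: str                                            # the S.M.A.R.T. statement itself
    key_actions: List[str] = field(default_factory=list)          # what will be done
    process_benchmarks: List[str] = field(default_factory=list)   # track actions taken
    outcome_benchmarks: List[str] = field(default_factory=list)   # track results achieved

goal = SmarterGoal(
    goal_statement=("Beginning in September, the Language Arts Department will create "
                    "monthly reading comprehension formative assessments so that 100% of "
                    "ELA teachers use them monthly and modify instruction based on the results."),
    key_actions=["Draft one formative assessment per month in department meetings"],
    process_benchmarks=["Assessment drafted and administered each month"],
    outcome_benchmarks=["100% of ELA teachers report modifying instruction based on results"],
)
print(goal.goal_statement)
```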

63 Step-by-Step with the MA “S.M.A.R.T.er” Goal Model
Step #1: Use data to identify the goal area
Step #2: Identify relevant elements from the rubric
Step #3: Focus on essential parts of the elements
Step #4: Draft the goal statement
Step #5: Add key actions and benchmarks
I want to model how to build a SMARTer Goal using the example from earlier about my classroom management problems back in Methuen all those years ago. You will need Handout 6: Step-by-Step with the Massachusetts “S.M.A.R.T.er Goal” Model (an example about classroom management). Read through the five steps on your own. Now let’s walk through the first four steps in a kind of “think-aloud.”
With a partner, study the goal statement: assess where it’s strong or weak in terms of S (specific, strategic), M (measurable), etc. Now look at the key actions and the benchmarks: do they shore up the weak letters? Enough? This appears to be a SMART enough goal: if the teacher does the actions he’s described and gathers the evidence called for by the process and outcome benchmarks, it’s pretty likely he would finish the year with much stronger classroom management and better student learning.
That said, it can be made SMARTer: with a partner, pretend that one of you is the teacher and one of you is Carl, the teacher’s social studies department head. Carl is going to work with the teacher to “smarten up” this goal with another or a different key action and another or a revised benchmark that will make it even more likely that committing to this goal will make a difference in classroom management and student learning. Share out.
Massachusetts Department of Elementary and Secondary Education

64 The S.M.A.R.T.er Goal Process: An Iterative Process
Revise the goal statement, key actions, and benchmarks as needed, but don’t obsess!
Your suggestions will make this goal smarter. That said, you want to avoid getting so focused on making the goal smarter and smarter that you lose too much valuable time when you, as department head and teacher, could be working on accomplishing the goal. So here is some advice.
Massachusetts Department of Elementary and Secondary Education

65 Guided Practice #2: A Superintendent’s Meetings
Proficient performance on IV-A-3: Plans and leads well-run and engaging administrator meetings that have clear purpose, focus on matters of consequence, and engage participants in a thoughtful and productive series of conversations and deliberations. Establishes clear norms for administrator team behavior.
Let’s return now to the earlier example of learning to run better meetings for the administrators in Lowell. Here is an excerpt from the Superintendent Rubric. It describes proficient performance for Standard IV, Indicator A, Element #3: Meetings. There’s a comparable one for principals/school-level administrators. For our second guided practice together, we are going to examine a sample superintendent goal that has to do with getting better at running these kinds of meetings. You will need both Handout 5: What Makes a Goal SMARTer? and Handout 7a: A Sample Professional Practice Goal for a Superintendent.
Massachusetts Department of Elementary and Secondary Education

66 Guided Practice #2
The goal statement: is it S.M.A.R.T.?
The key actions: Is each one tightly linked to the goal? What is missing to ensure effective implementation?
The benchmarks: Is there a process benchmark (tracking actions done)? Is there an outcome benchmark (tracking results achieved)?
Although it’s written here as a superintendent goal, it could easily be “tweaked” to become a goal for a principal or director. This similarity is deliberate; it’s likely that both the superintendent and everyone who sits in the superintendent’s meetings would want the superintendent to be successful with this particular goal!
Here is what we are going to be doing together as we unpack the sample superintendent goal. Read through Handout 7a: Sample Professional Practice Goal for a Superintendent (the goal statement, key actions, and benchmarks) on your own. When you and your partner are done, with page one of Handout 5: What Makes a Goal SMARTer? in one hand and the goal statement in the other, mark with the appropriate letter (S, M, A, R, T) the words or phrases in the goal statement that are “specific or strategic,” “measurable,” “action-oriented,” “rigorous, realistic or results-focused,” and “timed.” Where you can’t find a word or phrase in the goal statement to match a SMART criterion, see if you can find it in a key action or benchmark and note it there.
Massachusetts Department of Elementary and Secondary Education

67 Guided Practice #2: Sample Superintendent Goal
Goal Statement: During , I will devote at least 75% of administrative meeting time to district improvement goals and get better at using appropriate strategies to actively engage administrators in developing and sharing ways to implement those goals effectively at the school level.
Let’s go through this goal statement together, applying S.M.A.R.T. using the definitions in Handout 5: What Makes a Goal SMARTer?
Strategic? “district improvement goals” + “developing and sharing ways to implement those effectively”
Specific? “administrative meeting time” + “district improvement goals” + “strategies to actively engage administrators” + “developing and sharing ways to implement”
Measurable? “at least 75%” + “getting better at using appropriate strategies” (may be too “squishy”; it’s measurable enough only if you can identify a benchmark that permits you to measure, over the course of the year, the quantity of the strategies, the quality of the strategies, and/or the impact of the strategies. Is there a benchmark that does that? Which one?)
Action-Oriented? “I will devote time…” + “I will get better at using appropriate strategies…”
Rigorous, Realistic & Results-Focused? Results-focused (administrative meetings that actively engage administrators in developing and sharing ways to implement district improvement plans effectively at the school level). Realistic (75%, given emergencies? budget?). Rigorous?
Timed? “During ” To fully assess “T” (timed and tracked) we need to look carefully at the action steps and benchmarks.
Massachusetts Department of Elementary and Secondary Education

68 Guided Practice #2
Here are two changes you might consider making to the goal statement. First, I changed the word “to” to “that” to address the major weakness with this goal statement: it is not clear enough that the strategies have to have the result of actively engaging administrators. Second, I added a phrase at the beginning to make clear the strategic purpose this goal is intended to have: to help make sure that district goals are implemented in schools more effectively. The phrase “in order to” is a useful one to consider including in lots of goal statements because it helps make explicit the strategic purpose of the activity.
Just a note here: we can easily make the mistake of proposing a goal that is really just an activity devoid of any substantial purpose, similar to what some of us have done in our teaching: we do an activity because it’s fun or engaging, but it may not be tied to any substantive purpose beyond that. The same thing can happen with leaders and leaders’ goals.
Massachusetts Department of Elementary and Secondary Education

69 Guided Practice #2
The goal statement: is it S.M.A.R.T.?
The key actions: Is each one tightly linked to the goal? What is missing to ensure effective implementation?
The benchmarks: Is there a process benchmark (actions done)? Is there an outcome benchmark (results)?
Key actions: Looking carefully at the key actions, let’s focus in on the A (action-oriented) and R (realistic). They set the stage for good process benchmarks. So, looking at the four key actions in the sample superintendent goal for meetings: is each tightly linked with the goal? How? Is anything important missing? Suggestion: this summer, read Skillful Teacher to identify a broader repertoire of strategies for active engagement.
Benchmarks: Benchmarks get at the “tracked” part of the T in SMART. They are also where the rubber hits the road in terms of the R: results. Does this process benchmark track one or more critical actions? Is there a different one, or another one, that would help this superintendent track progress on doing what s/he said s/he needed to do to achieve this goal? How about this outcome benchmark? Is it a good, practical, and necessary way to track results (outcomes)? Will it tell us whether the goal of more meeting time devoted to district goals, and better and better strategies for actively engaging administrators, has really happened? Is there a measure of quantity, quality, and/or impact? Is there a new or better one you’d add?
Massachusetts Department of Elementary and Secondary Education

70 What’s Really “New” Here: Professional Practice Goals
Student learning and school/district improvement goals are not “new” to us; developing them as MA “SMARTer” goals with a goal statement, key actions, and process/outcome benchmarks is pretty new.
What’s really new are professional practice goals, in which educators have to be explicit about what we’re going to get better at, not just what we are going to do.
Most of us have had experience with student learning, school improvement, and district improvement goals. Fewer of us have had experience writing goals in ways that will help us achieve them. That’s what the MA model for SMART goals is designed to do by having the SMARTer goal include a goal statement, key actions, process benchmarks, and outcome benchmarks. Those parts are the components of the Educator Plan.
What’s really new, though, are the professional practice goals. With professional practice goals we have to be explicit about the learning and practice we’re undertaking: what we’re going to get better at doing, how we’re going to go about getting better, and what evidence we can have for whether we’ve gotten better and how much better we’ve gotten. That’s new. We’re making explicit that we often have to learn and practice things in order to do other things well: to have better classroom management, we have to study and practice classroom routines and rituals and ask for and get feedback from observations; to have meetings that actually help participants implement district goals better, we have to learn how to plan meetings differently and learn new strategies for leading meetings and for assessing what participants get out of our meetings; to conduct school or classroom observations that actually have an impact on leading and teaching, we need to learn how to observe differently and with more focus and insight, and how to write and communicate more useful and meaningful feedback in ways that leaders and teachers can “hear and understand.”
Building professional practice goals into the heart of our educator evaluation system makes explicit the one thing we have to know, acknowledge, and celebrate about our work: we are all learners, and we have a responsibility to continue to learn and apply that learning to our work effectively.
Massachusetts Department of Elementary and Secondary Education

71 Guided Practice #3: A Principal’s Observations and Feedback
Goal Statement for Classroom Observation & Feedback: I will manage my time more effectively in order to increase the frequency and impact of classroom observations by learning how to do 10-minute observations and conducting eight visits with feedback per week, on average.
You will need Handout 8a: A Sample Professional Practice Goal for a Principal for Guided Practice #3. We will do a sample principal goal this time. Here is the goal statement this principal has written for classroom observation and feedback. Again, this is a topic that will be of interest to both principals and central office administrators: principals as both evaluators and evaluatees; central office administrators as evaluators, certainly, but how about as evaluatees, too? Why?
Read through the goal statement first. What are its strengths? What are its potential weaknesses (recognizing that some may be addressed in the key actions and benchmarks)?
Suggested discussion points: S = frequency and impact of 10-minute classroom observations (“impact” on whom?). M = eight visits with feedback (is “impact” measurable?). A = manage time…conducting. R = ? T = it is not explicit when the eight-visits-per-week pace will begin or by when greater impact will have happened.
With a partner, how would you revise the goal statement? Share.
Massachusetts Department of Elementary and Secondary Education

72 Guided Practice #3: A Principal’s Observations and Feedback
Goal Statement for Classroom Observation & Feedback: I will manage my time more effectively in order to increase the frequency and impact of classroom observations by learning how to do 10-minute observations and, by the start of second semester, conducting eight visits with feedback per week, on average, that an increasing percentage of teachers, beginning with at least 60%, report are useful.
Here are two additions you might consider making to the goal statement to take it beyond being an activity of only “doing observations” to holding oneself responsible for making sure the observations have an impact, and that as the principal gets better, the impact grows.
Massachusetts Department of Elementary and Secondary Education

73 Guided Practice #3
In pairs:
Review the key actions and benchmarks: is anything important missing?
Identify two revisions and/or additions to the actions and/or benchmarks that will make this SMART goal “S.M.A.R.T.er”
Share
Added Key Action: By February 1st, I will observe at least 5 classes with a principal colleague and/or the superintendent and use their feedback to improve my classroom observation and feedback skills.
Added Benchmark (Outcome): Analysis of teacher feedback demonstrates that at least 75% of teachers see the observations and feedback as useful.
Massachusetts Department of Elementary and Secondary Education

74 An Example NEC and SEEM Workshop

75 Self-Assessment: Three Parts
Each educator shall be responsible for gathering and providing to the evaluator information on the educator's performance, which shall include:
an analysis of evidence of student learning, growth, and achievement for students under the educator's responsibility;
an assessment of practice against Performance Standards; and
proposed goals to pursue to improve practice and student learning, growth, and achievement.
The educator shall provide such information, in the form of self-assessment, in a timely manner to the evaluator at the point of goal setting and plan development.

76 Isaac Foster: Analysis of Student Learning Needs
School: Only 50% of the 6th, 7th, and 8th grade students read at grade level.
Team: 40% of our team's incoming 8th grade students read at least 2 grade levels below 8th grade; 25% of them read at or below the 3rd grade level.
Classroom: 3 students are repeating the 8th grade; 50% have IEPs; 20% are ELLs. The majority of students report not enjoying reading, finding it frustrating and a waste of time. This frustration and these struggles carry over into content areas, making access to texts in science, history, and mathematics difficult.

77 Prioritizing
Consider:
District, school, or team goals
Connection between student learning needs and areas for professional growth
Timeline
Focusing in on a particular Indicator or group of related Elements
Massachusetts Department of Elementary and Secondary Education

78 Two Types of Goals in Regulations (35.02)
Student Learning Goals: “specified improvement in student learning, growth, and achievement”
Professional Practice Goals: “educator practice in relation to performance standards, educator practice in relation to indicators”

79 A Not-So-SMART Goal (Team Goal, Professional Practice)
We will create reading comprehension formative assessments and analyze the resulting formative data.
SMART criteria: Specific and Strategic; Measurable and Monitored; Action Oriented and Agreed Upon; Realistic and Results Oriented; Time-Bound and Tracked

80 SMART Goal Rewrite (Team Goal, Student Learning)
100% of the 8th grade team's students will advance 1-2 reading levels by the end of the first semester, as measured by the reading comprehension scores on the DRA-2, so that by the end of the school year all students have advanced 2 or more reading levels in reading comprehension.
SMART criteria: Specific and Strategic; Measurable and Monitored; Action Oriented and Agreed Upon; Realistic and Results Oriented; Time-Bound and Tracked

81 A Not-So-SMART Goal (Team Goal, Professional Practice)
We will create reading comprehension formative assessments and analyze the resulting formative data.
SMART criteria: Specific and Strategic; Measurable and Monitored; Action Oriented and Agreed Upon; Realistic and Results Oriented; Time-Bound and Tracked

82 SMART Goal Rewrite (Team Goal, Professional Practice)
Beginning in September, the Language Arts Department will create monthly reading comprehension formative assessments so that 100% of the ELA teachers are using them monthly, analyzing the resulting formative data, and modifying instruction based on those results.
SMART criteria: Specific and Strategic; Measurable and Monitored; Action Oriented and Agreed Upon; Realistic and Results Oriented; Time-Bound and Tracked

83 Create a “Through Line” Across Goals
Diagram: How can I manage my professional growth? District goals, school goals, professional practice goal(s), and student learning goals connect through educator evaluation

84 Isaac Foster: Student Learning SMART Goals
School Goal: 80% of our students will read at or above grade level by the end of the school year.
8th Grade Team Goal: 100% of the 8th grade team's students will advance 1-2 reading levels by the end of the first semester, as measured by reading comprehension scores on the DRA-2.
Individual Goal: Based on survey results, the percentage of my students reporting they enjoy reading will increase by 10% each quarter, so that by the end of the year there is a 40% overall increase.
Massachusetts Department of Elementary and Secondary Education

85 One of Isaac’s Proposed Professional Practice Goals
During my daily lessons, I will implement strategies from the August 2011 district PD session on how to refine questioning. These questions will be captured in my lesson plans and reflection notes so I can get peer feedback from the ELA coach and my colleagues.
Is it aligned with his self-assessment and student learning outcome goals? Is it a SMART goal?
TASK: Rewrite Isaac’s goal.
SMART criteria: Specific and Strategic; Measurable and Monitored; Action Oriented and Agreed Upon; Realistic and Results Oriented; Time-Bound and Tracked

86 SMART Goal Rewrite (Individual Goal, Professional Practice)
During my daily lessons, I will implement strategies from the August 2011 district PD session on how to refine questioning. By the end of the first semester, 60% of my students will respond to at least two higher-order thinking questions (based on Bloom’s taxonomy) at the evaluation, synthesis, and/or analysis levels each class period. These questions and the responders will be captured in my lesson plans and reflection notes so I can get peer feedback from the ELA coach and my colleagues.
SMART criteria: Specific and Strategic; Measurable and Monitored; Action Oriented and Agreed Upon; Realistic and Results Oriented; Time-Bound and Tracked

87 Wrap-Up: Goal Statement “Starters”
In pairs:
First, review the Sample School or District Goal Statements and identify: District/School Improvement Goal Statements, Student Learning Goal Statements, Professional Practice Goal Statements
Next, identify which could be TEAM goals
Finally, choose one to make “SMARTer” back in your school or district
Let’s take a look again at Handout 7a: A Sample Superintendent Goal and Handout 8a: A Sample Principal Goal. This time, look at the side of the superintendent goal sheet that has sample district goal statements. The sample principal goal sheet has sample goal statements for school goals. Both have samples for improvement goals, student learning goals, and professional practice goals.
1. Review either the school or district goals in pairs. Mark each goal statement: is it the start of an improvement goal, a student learning goal, or a professional practice goal statement?
2. Which might be TEAM goals in your school or district?
3. Please choose one of the beginning goal statements to work on back in your district or school, OR, if none seem urgent enough for you in your context, identify a different need you’ve already identified for your school or district.
Notes on debrief: (1) There’s overlap among the categories, especially between student learning goals and school/district improvement goals. (2) Remember that professional practice goals are not just activities: they must have a purpose AND they have to involve you getting better at something, learning and applying something new.
Massachusetts Department of Elementary and Secondary Education

88 Your “Homework”
1. Back in your district, with your partner:
Refine the goal statement you chose to fit your context OR develop another one
Draft 3 key actions
Draft 1 process benchmark
Draft 1 outcome benchmark
2. Exchange your draft SMARTer Goal with another pair
3. Work together to make each draft SMARTer so you can use the revised SMARTer Goal as one of the goals you propose to your evaluator
To get you ready to continue this work back in your schools and district offices, start by proposing a plan that will result in your having at least one SMART goal to propose to your evaluator. Your homework will be to work with a partner to refine the goal statement to make it work for your context (or draft one for the alternative topic you’ve chosen), identify at least three key actions, one process benchmark, and one outcome benchmark. Set a deadline, then exchange draft SMARTer Goals with another pair from your district and work together to make the first draft SMARTer.
Massachusetts Department of Elementary and Secondary Education

89 Next Steps for Reading
Collective bargaining process for areas not in regulations
Meeting with individual schools to discuss the process further
Training for primary and secondary supervisors on the process and calibration of the rubric
TAP Committee summer work
New forms
Planning professional development opportunities
September inservice
SMART goal development

90 “An” Initiative? Or “The” Organizing Initiative?
Meaningful implementation of educator evaluation has the potential to significantly impact the professional growth and development of teachers, administrators, and other professional support staff. However, it is only one of many initiatives that schools and districts are engaged in. It would be a missed opportunity to implement evaluation as “one more initiative” rather than using it as a lever to strategically organize and advance the other work that must be simultaneously accomplished.
Massachusetts Department of Elementary and Secondary Education

91 Examples of District/School Initiatives
Adopting the new MA Curriculum Frameworks
21st Century/Global Skills
Anti-Bullying
Professional learning communities
Examining student work
Data Teams
Project-Based Learning
Common course/grade-level assessments
Elementary Report Cards
Social Emotional Health
BYOD
This list represents a sample of initiatives that districts and schools have shared with ESE staff as high priorities for implementation right now. Others can certainly be added to the list, and the relative importance of each of these initiatives varies across local contexts. Evaluation could be added as another bullet on the list, OR it could be utilized as a framework that helps organize the implementation of these other initiatives and adds meaning and context to that work. As noted earlier, the rubrics drive much of the evaluation process and help to identify areas of emphasis and priority.
HANDOUT: Teacher Rubric At-a-Glance / School-Level Administrator Rubric At-a-Glance
If you look at the handout of the “School-Level Administrator Rubric At-A-Glance,” you see that the Standards, Indicators, and Elements encompass the activities associated with the implementation of the initiatives listed above. For example, “Adopting the new MA Curriculum Frameworks” is directly connected to Element I-A-1, “Standards-Based Unit Design.” “Anti-Bullying” is directly connected to Element II-A-3, “Student Safety, Health, and Social and Emotional Needs.” To increase the coherence with which a variety of initiatives are implemented, there should be intentional overlap between the evaluation work and other school/district efforts. A meeting to “unpack the rubric” that is focused on Standard I, “Instructional Leadership,” should be informed by substantive content and information related to other efforts, such as adopting the new curriculum frameworks. Conversely, a meeting to plan the work of adopting the new curriculum frameworks should also be taken as an opportunity to note the connection to the rubric and to communicate that Element I-A-1 of the administrator rubric (and Element I-A-3, “Rigorous Standards-Based Unit Design,” of the teacher rubric) will be a high-priority element for that school year. To differentiate this work at the school and district level, leadership teams can use their district’s rubrics as an organizational and communications tool by identifying such connections between elements and local initiatives/priorities and sharing that information with administrators and teachers schoolwide and/or districtwide.
Massachusetts Department of Elementary and Secondary Education

92 Some Final Thoughts As An Early Adopter
This may be the most important initiative that you undertake in your district
Look at this as an opportunity to improve teaching, learning, and educator growth in your district
Plan your strategy and process
Train staff on how to write and implement SMART goals
Use your special education teachers as a resource for SMART Goal development
Collaboration is critical to the success of this implementation
Link this system to the Common Core and assessment development
Use the DESE materials
Adopt the model rubric
Transparent, ongoing, open, and honest communication is critical
Develop a logic model for how you will implement this process
Involve your staff, school committee, and community early and often in the communication process

93 Questions and Thank You
Wiki with Resources

94 Thank You!

95 Reading Public Schools
Staff Meetings

96 5 Step Evaluation Cycle
Continuous Learning
Every educator is an active participant in an evaluation
Process promotes collaboration and continuous learning
Foundation for the Model
Slide 12: 5-Step Evaluation Cycle. This graphic represents the framework in the regulations by depicting the 5-step process of continuous learning.
Two additional key points to mention:
One, the framework puts responsibility on both the educator and the evaluator to complete the evaluation cycle.
Two, the length of the cycle is determined by an Educator’s plan, which varies by current career stage and the previous year’s performance. For example, teachers and administrators with PTS will typically be on a two-year cycle; all other educators will typically be on a one-year cycle. (next slide)
Massachusetts Department of Elementary and Secondary Education

97 5 Step Evaluation Cycle: Rubrics
Part III: Guide to Rubrics, Pages 4-5
Rubric is used to assess performance and/or progress toward goals
Rubric is used to analyze performance and determine ratings on each Standard and Overall
Every educator uses a rubric to self-assess against the Performance Standards
Professional Practice goals – team and/or individual – must be tied to one or more Performance Standards
Evidence is collected for Standards and Indicators; the rubric should be used to provide feedback
Now that we’ve reviewed the basic structure of the rubrics, let’s look at how they’re used during the 5-Step Evaluation Cycle. As you can see, rubrics are used at each stage.
#1 Self-Assessment: The rubric is a tool to guide self-assessment against the four performance standards.
#2 Analysis, Goal Setting & Plan Development: Educators and evaluators use the rubric to help select specific goals that are aligned to the standards, indicators, and elements, and to build an educator plan around the attainment of these goals.
#3 Implementation of the Plan: The rubric is a critical tool for tracking progress and collecting evidence of practice, while also serving as a guide for the evaluator to provide feedback to the educator.
#4 Formative Assessment/Evaluation: At the formative assessment/evaluation stage, the rubric is the primary tool evaluators will use to assess progress toward goals and to provide targeted FEEDBACK to the educator.
#5 Summative Evaluation: The rubric provides the goal posts: what performance level has the educator achieved on each Standard based on the evidence of their practice, and what is their overall rating?
Massachusetts Department of Elementary and Secondary Education

98 An example
With this kind of alignment, think of what is possible in terms of moving district and school improvement efforts forward. Take, for example, what I [Karla Baehr] might have been able to do in Lowell had this framework been in place.
ELL example: Establish a district student learning goal of shortening the time it takes for ELLs to achieve fluency in English, plus a district professional practice goal of developing skills and knowledge in sheltering English and teaching academic vocabulary. Translate that to school and principal goals appropriate for each school’s context. Translate those to goals for the school leadership team and to appropriate goals for each grade level, department, and other educator teams. And finally, translate the goals in ways that make sense for individual teacher and specialist goals.
Imagine the improvement that could be made because of the power of concerted action. That said, we shouldn’t lose sight of the opportunity for “bottom-up” goal setting: results of teacher goals can inform the goals of teams, schools, and districts moving forward; and/or the goals of the team can be informed by the individual goals, the goals of the school by the goals teams propose, and the district goals by the goals proposed at the school level.
Massachusetts Department of Elementary and Secondary Education

99
Bringing the rubrics and goals back to the 5-Step Cycle, let’s look at a different example of a professional practice goal, again from Lowell: running administrative meetings. Example: As I [Karla Baehr] moved from being superintendent in Wellesley, with eight schools and few central office administrators, to Lowell, with its 24 schools and many specialized central office administrators (Title I, Bilingual Education), I realized I didn’t really know how to run engaging meetings for 30, 50, or even 75 administrators. I was used to sitting around a table with eight principals. Look through each step to see what I would have done had the five-step cycle been in place.
Massachusetts Department of Elementary and Secondary Education

100 What is the same? Focuses on Educator Growth and not “Gotcha”
Five Step Evaluation Cycle
Self-Assessment
Analysis, Goal Setting, Educator Plan Development
Implementation of Plan
Formative Assessment (Midyear or Mid-cycle)
Summative Evaluation (End of Year/Cycle Evaluation)
Rubric for Evaluation
Use of Artifacts for Evidence
Lesson Plans, Professional Development Activities, Fliers
Walkthroughs
Differentiated Approach
New Teachers
Non-PTS Teachers
PTS Teachers
PTS Teachers who need additional support
Use of SMART Goals

101 What is different? Levels of Performance on Rubric
Exemplary (Exceeding the Standard)
Proficient (Meeting the Standard)
Needs Improvement (Progressing Toward the Standard)
Unsatisfactory (Does Not Meet the Standard)
Specificity of Rubric
Standards
Indicators
Elements
Four Standards instead of Six
Fewer “Formal” Observations
Multiple Measures of Student Performance ( School Year)
Use of student surveys ( School Year)

102 Multiple sources of evidence inform the summative performance rating
This graphic explains the summative performance rating in the new educator evaluation framework.
The left column includes the three categories of evidence used in evaluation: Products of Practice (including observations, unit plans, schedules, and the like), Multiple Measures of Student Learning (ranging from classroom assessments to MCAS Student Growth Percentile scores and MEPA, where available), and Other Evidence, including, eventually, student feedback. (A note here: MEPA, the Massachusetts English Proficiency Assessment for English Language Learners, is being replaced by a better assessment being developed by a national consortium of states called WIDA, pronounced “WEE-DUH.”)
The middle column represents the translation of the evidence into an assessment of performance on each of the four Standards, in addition to an assessment of progress on goals. The right column is the single summative performance rating.
Starting from the left, using a rubric, an evaluator lines up evidence from the three categories to determine a rating on each of the four Standards (for teachers: Curriculum, Planning, and Assessment; Teaching All Students; Family and Community Engagement; and Professional Culture) and an assessment of progress on both student learning and professional practice goals.
One important note: educators have to receive a rating of Proficient or Exemplary on the Standard having to do with curriculum and instruction (for teachers) or instructional leadership (for administrators) in order to be eligible for an overall summative performance rating of Proficient or Exemplary.
Massachusetts Department of Elementary and Secondary Education

103 Educators earn two separate ratings
Summative Performance Rating (rows) by Rating of Impact on Student Learning (columns: Low / Moderate / High; based on multiple measures of performance, including MCAS Student Growth Percentile and MEPA where available):
Exemplary – 1-Year Self-Directed Growth Plan (Low impact); 2-Year Self-Directed Growth Plan (Moderate or High impact)
Proficient – 1-Year Self-Directed Growth Plan (Low impact); 2-Year Self-Directed Growth Plan (Moderate or High impact)
Needs Improvement – Directed Growth Plan
Unsatisfactory – Improvement Plan
Today (and for the implementation of educator evaluation), we are just focusing on the Summative Rating for performance. The Impact Rating is not yet relevant; the Department will not be releasing guidance on district-determined measures until June 2012, and districts will use the school year to select their district-determined measures. Even then, since an Impact Rating requires trends and patterns from at least two years of data, the first student impact ratings for teachers won’t be available until after the year at the earliest. So today we are not considering the Rating of Impact on Student Learning, which we call Phase II of the Educator Evaluation Framework. Today we are just talking about the Summative Performance Rating, which is Phase I of the framework.
Massachusetts Department of Elementary and Secondary Education

104 What Plan Will I Be On Next Year?
This School Year → Next School Year
Non-PTS (will be Non-PTS next year) → Developing Educator Plan
Non-PTS (will be PTS next year) → Self-Directed Growth Plan
PTS on Year 1 of TAP Cycle → Year 2 of Self-Directed Growth Plan
PTS on Year 2 of TAP Cycle → Self-Directed Growth Plan or Directed Growth Plan
PTS new to an assignment → Developing Educator Plan or Self-Directed Growth Plan
PTS on Year 1 of Alternative Evaluation → Will complete Year 2 of Alternative Evaluation, then the new system
PTS on Additional Assistance Plan, continuing on it next year → Directed Growth Plan
PTS on Additional Assistance Plan, not continuing on it next year →

105 Timeline
Event → Due Date
Evaluator meets with Educators in teams or individually to establish Educator Plans → October 15
Educator Plans due to Evaluators → October 30
Mid-cycle for 1-Year Educator Plans → February 1
Evaluator completes summative evaluation report → June 10
Educator signs Summative Evaluation Report → June 15

