
1 Educator Evaluation Reform in New Jersey – November 16, 2012

2 The Case for Reforming Teacher Evaluation Systems: Impact
– Nothing schools can do for their students matters more than giving them effective educators
– Principal and teacher quality account for nearly 60% of a school's total impact on student achievement (Marzano et al., 2005)
– The effect of increases in teacher quality swamps the impact of any other educational investment, such as reductions in class size (Goldhaber, 2009)
– Replacing one poor teacher with an average one increases a classroom's lifetime earnings by ~$266,000 (Chetty et al., 2011)
Top educators have a lasting impact on their students' success – in academics and in life

3 Evolution of Evaluation Reform in New Jersey
– 2013-14: Statewide implementation of new evaluation system
– 2012-13: Cohort 2 teacher evaluation and new principal evaluation pilots in progress; districts building capacity; new tenure legislation in effect
– 2011-12: Teacher evaluation pilot in progress; capacity-building requirements announced for all districts to follow in 2012-13
– 2010-11: NJ Educator Effectiveness Task Force work; teacher evaluation pilot opportunity announced

4 2012-13 Teacher Evaluation Pilot Weights
Tested Grades and Subjects – equal weighting:
– Teaching Practice (TP) includes the following components, totaling 50% of the pie:
  ∙ Teaching Practice Evaluation Framework (40%-45%)
  ∙ Other Measures of Teaching Practice (5%-10%)
– Student Achievement (SA) includes the following components, totaling 50% of the pie:
  ∙ Growth on NJ Assessments as measured by SGP (35%-45%)
  ∙ School-Wide Performance Measure (5%-10%)
  ∙ Other Performance Measures, optional (0%-10%)
Non-Tested Grades and Subjects – variable weighting (districts have discretion):
– Teaching Practice (TP) includes the following components, totaling 50%-85% of the pie:
  ∙ Teaching Practice Evaluation Framework (45%-80%)
  ∙ Other Measures of Teaching Practice (5%-10%)
– Student Achievement (SA) includes the following components, totaling 15%-50% of the pie:
  ∙ Student Achievement Goals (10%-45%)
  ∙ School-Wide Performance Measure (5%-10%)
– Districts determine how much of the remaining 35% of the pie is allocated to TP and/or SA
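To make the weighting arithmetic concrete, here is a minimal sketch of how component scores could be combined into one summative score, assuming a single permissible tested-grades configuration (40% practice framework, 10% other practice measures, 40% SGP, 5% school-wide measure, 5% other performance measures) and a 1-4 score scale. The function name, the scale, and the specific weight choices are illustrative assumptions, not part of the pilot specification.

```python
# Illustrative sketch only: combining component scores into a summative score
# under one permissible tested-grades weighting from the 2012-13 pilot.
# The 1-4 scale, the exact weights, and the function name are assumptions.

def summative_score(components, weights):
    """Weighted average of component scores; weights must total 100%."""
    if abs(sum(weights.values()) - 1.0) > 1e-9:
        raise ValueError("Component weights must sum to 100% of the evaluation.")
    return sum(components[name] * weight for name, weight in weights.items())

# One allowable tested-grades configuration (TP = 50%, SA = 50%).
weights = {
    "practice_framework": 0.40,   # Teaching Practice Evaluation Framework (40-45%)
    "other_practice":     0.10,   # Other Measures of Teaching Practice (5-10%)
    "sgp":                0.40,   # Growth on NJ Assessments as measured by SGP (35-45%)
    "school_wide":        0.05,   # School-Wide Performance Measure (5-10%)
    "other_performance":  0.05,   # Other Performance Measures, optional (0-10%)
}

# Hypothetical component scores on a 1-4 scale.
scores = {
    "practice_framework": 3.0,
    "other_practice":     3.0,
    "sgp":                2.5,
    "school_wide":        4.0,
    "other_performance":  3.0,
}

print(round(summative_score(scores, weights), 2))  # prints 2.85
```

For non-tested grades and subjects, the same combination step would apply, but a district would first choose weights within the wider allowable ranges shown above.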

5 2012-13 Teacher Evaluation Pilot: Changes from First Cohort
Based on learning from the 2011-12 pilots and national best practices, the 2012-13 pilot includes:
– Flexibility in minimum duration for classroom observations
– Fewer required observations for teachers of non-core subjects
– Use of double-scoring
– Unannounced observations
– Use of external evaluators
– Flexibility in weighting for tested and non-tested grades and subjects

6 Importance of Training
"Training of evaluators is key! Training was ongoing and included an eclectic approach: whole group that included teacher leaders, coaching by Superintendent, and instrument provider. Ongoing debriefing and double-scoring for training purposes were key strategies to support the learning of all administrators. The alignment between curriculum, lesson planning, assessment was essential in guiding our work." – Cohort 1 Survey Response
Many districts are also using turnkey training to save time and money, and to engage educators in the process.

7 2012-13 Evaluation Pilot Feedback Loops
Sources of feedback:
– State Evaluation Pilot Advisory Committee (EPAC) provides recommendations on the pilot and statewide implementation
– Each pilot district convenes a District Evaluation Advisory Committee (DEAC); DEACs meet monthly to discuss pilot challenges and provide feedback, and each district convenes one DEAC to cover both teacher and principal evaluation work
– An external evaluator (Rutgers for 2011-12) studies pilot activity and provides reports
Outcomes:
– Assess the impact of new observation and evaluation protocols
– Convey best practices and lessons learned for the rest of the State
– Inform proposed regulations for 2013-14 and subsequent school years

8 DEAC Impact: Cohort 1 Pilot Survey
"Having a balanced representation of parents, teachers, administrators, and community members has allowed us to address the needs and ideas from every stakeholder in the district. Parents were able to cite students' positive reaction to the evaluators in the classroom. Parents were happy to know that we were using student achievement [as] a part of the teacher evaluation system. The DEAC process enabled stakeholders to share information." – Cohort 1 Survey Response

9 DEACs: Required for ALL New Jersey Districts
Districts must convene DEACs by October 31, comprised of the following:
– Teachers from each school level represented in the district
– School administrators conducting evaluations (this must include one administrator who participates on the School Improvement Panel and one special education administrator)
– Central office administrators overseeing the teacher evaluation process
– Supervisor
– Superintendent
– Parent
– Member of the district board of education
The mission of the DEAC is to:
– Solicit input from stakeholders
– Share information
– Guide and inform evaluation activities
– Generate buy-in

10 School Improvement Panel: Required for All Districts
Charge:
– Establish in each school in the district by February 1, 2013
– Ensure effectiveness of the school's teachers
Composition:
– School principal or designee, assistant/vice principal, teacher
– Note: the teacher will not participate in evaluation activities except with approval of the majority representative
Duties:
– Oversee mentoring, conduct evaluations, identify professional development opportunities
– Conduct a mid-year evaluation of any teacher rated ineffective or partially effective in the most recent annual summative evaluation

11 Summary of Lessons Learned from Cohort 1 Pilots
– Stakeholder engagement is critical; open district advisory committee meetings facilitate transparency and trust
– Timely, comprehensive, and quality training of educators and evaluators must be emphasized
– Selection of a teaching practice instrument takes time and should include stakeholder input
– Administrators face capacity challenges
– Developing measures for non-tested grades and subjects is challenging

12 Lessons Learned from Cohort 1: End of Year Reports
Key Successes: Training and Communication
– Strategies for training:
  ∙ Onsite vendor support
  ∙ Online video exemplars
  ∙ District Q&A sessions
  ∙ Analysis of double-scoring
– Strategies for communication:
  ∙ Working with DEACs
  ∙ Sharing information with EPAC
  ∙ Engaging with NJDOE supports
  ∙ Sharing resources and information
Key Challenges: Time Constraints and Assessing Non-Tested Grades and Subjects (NTGS)
– Time constraints:
  ∙ Scheduling training
  ∙ Scheduling/completing observations
  ∙ Gaps between training and observations
  ∙ Scheduling DEAC meetings
– Assessing NTGS:
  ∙ Identifying/developing assessments
  ∙ Aligning with new standards
  ∙ Updating technology platforms
  ∙ Seeking guidance from NJDOE

13 Future Pathways for Evaluation
2014-15:
– Cycle of continuous improvement
– Ongoing data collection and analysis
– Applying lessons learned and modifying policies as needed
2013-14:
– Final report on pilots
– Support for statewide implementation
– Learn from implementation challenges
– Appropriate course adjustments
– Possible additional regulatory changes
– Learn from implementation results

14 Website and Contact Information
Website: http://www.state.nj.us/education/evaluation
Contact information:
– For general questions, please email educatorevaluation@doe.state.nj.us
– Phone: 609-777-3788

