March Madness Professional Development Goals/Data Workshop.

Welcome! The objective of March Madness Educator Evaluation PD is to provide participants with implementation tips and strategies that help make the teacher evaluation process meaningful, doable, and focused on increased student learning.

Massachusetts Department of Elementary and Secondary Education

Priorities of the new evaluation framework:
 Place Student Learning at the Center: student learning is central to the evaluation and development of educators.
 Promote Growth and Development: provide all educators with feedback and opportunities that support continuous growth and improvement through collaboration.
 Recognize Excellence: encourage districts to recognize and reward excellence in teaching and leadership.
 Set a High Bar for Tenure: entrants to the teaching force must demonstrate Proficient performance on all standards within three years to earn Professional Teacher Status.
 Shorten Timelines for Improvement: educators who are not rated Proficient face accelerated timelines for improvement.

We want to ensure that each student in the Commonwealth is taught by an effective educator, in schools and districts led by effective leaders.

The 5-Step Evaluation Cycle: A Step-by-Step Review

5-Step Evaluation Cycle: Continuous Learning
 Every educator is an active participant in his or her own evaluation
 The process promotes collaboration and continuous learning

Step 1: Self-Assessment
 Educators self-assess their performance using student data and a performance rubric based on the Standards and Indicators of Effective Teaching Practice and/or Administrative Leadership
 Educators propose goals related to their professional practice and student learning needs
(Part II: School-Level Guide, pages 14-22)

Step 2: Analysis, Goal Setting, and Plan Development
 Educators set S.M.A.R.T. goals: a student learning goal and a professional practice goal (aligned to the Standards and Indicators of Effective Practice)
 Educators are required to consider team goals
 Evaluators have final authority over goals
(Part II: School-Level Guide, pages 23-31)

A “S.M.A.R.T.er” Goal: Goal Statement + Key Actions + Benchmarks (Process & Outcome) = Educator Plan

Step 3: Implementation of the Plan
 The educator completes the planned action steps of his or her plan
 The educator and evaluator collect evidence of practice and goal progress, including multiple measures of student learning, observations and artifacts, and additional evidence related to performance standards
 The evaluator provides feedback
(Part II: School-Level Guide, pages 32-39)

Strategic Evidence Collection
 Prioritize based on goals and focus areas
 Quality, not quantity
 Artifacts should be “naturally occurring” sources of evidence (e.g., lesson plans)
 Consider common artifacts for which all educators are responsible

Observations
 The regulations define Proficient practice with regard to evaluation as including “frequent unannounced visits to classrooms” followed by “targeted and constructive feedback to teachers” (603 CMR 35.04, “Standards and Indicators of Effective Administrative Leadership Practice”)
 The Model System recommends short, frequent unannounced observations for all educators, as well as at least one announced observation for non-PTS and struggling educators.

Step 4: Formative Assessment/Evaluation
 Occurs midway through the 5-Step Cycle: typically January/February for educators on a one-year plan (formative assessment), and typically May/June for educators on a two-year plan (formative evaluation)
 The educator and evaluator review evidence and assess progress on the educator’s goals
(Part II: School-Level Guide, pages 40-47)

Step 5: Summative Evaluation
 The evaluator determines an overall summative rating of performance based on a comprehensive picture of practice captured through multiple sources of evidence
 The Summative Performance Rating reflects ratings on each of the four Standards and progress toward goals
(Part II: School-Level Guide, pages 48-53)

Implementation tips and strategies
1. Data collection and interpretation: accessing and analyzing reliable, relevant data points to determine whether students have met the goals set for them.
2. Strategic collection of evidence: making evidence collection more intentional and meaningful.
3. End-of-year (EOY) conferencing session: maximizing the positive impact of this conversation on your teaching practice.

1. Focusing on student learning (SL) goals
 What data do you plan to use to track students’ progress toward the goal you have set?
 What adjustments (if any) have you made to your instructional practice as a result of any formative student data connected to this goal?
 What assistance do you need from your evaluator as you move forward with this work?

2. Strategic collection of evidence
 You are in the driver’s seat! You decide what will be used in artifact collection.
 To collect evidence strategically, ask yourself what evidence is emerging authentically from the implementation of your goals. Use these pieces first to show proficiency in as many indicators as possible.
 Then, for the remaining indicators not addressed within the parameters of the goals you set, look at other components of your teaching practice to gather relevant evidence.

3. How will you share the story of your students’ learning this year?
 Frame the progress of your SL and professional practice (PP) goals so you can share the story they create with your evaluator.
 Beginning: steps one and two
 Middle: step three
 End: steps four and five

Summative conference
 Use the observation sheet while viewing to capture what you see and hear the evaluator and teacher saying and doing with regard to increased student learning outcomes. What connections can you make to your own evaluation meeting? Turn to your neighbor and share your observations.

Strategic Planning Time
 Analyze the components of your SL (or PP) goal, cross-referencing them against the evaluation rubric.
 What authentic evidence from your goals could be used as artifacts during your EOY meeting?
 Are there any connected pieces? For example, formative assessment data that triggers a differentiated lesson plan, along with an analysis of student work samples through collegial collaboration.

Create an action plan
What are your next steps for completing your SL goal analysis and preparing for your EOY evaluation meeting?

Share-out session
 Describe your takeaways from this session: how you would synthesize the process, and the actions you plan to take to ensure student learning outcomes.