A Model For Change
William A. Bussey, Superintendent
Mid-East Career and Technology Centers
Implementing OTES through the Ohio Appalachian Collaborative (OAC) and the Teacher Incentive Fund (TIF)
◦ All teachers will be evaluated using the OTES Model Teacher Performance Standards
◦ All administrators are credentialed evaluators
◦ All teachers will develop and monitor one Student Learning Objective
◦ Review the processes, results, and challenges with both the performance rating and the student growth measures
◦ Reflect on the good, the bad, and the ugly
◦ Review the changes as we transition to the "authentic" evaluation tool for teachers this coming school year
I have brought the experts with me!
◦ Barb Funk
◦ Dan Coffman
◦ Scott Sabino
◦ Michelle Patrick
OTES Team created in the spring of 2012
◦ Executive Director
◦ Building Directors
◦ New Assistant Director, whose role is evaluation
◦ Assistant Director, whose role is Curriculum and OAC/TIF
◦ Teachers
Introduced the OTES framework to all leadership teams
◦ District Transformation Team
◦ Strategic Compensation Team
◦ OTES Team
◦ Formative Instructional Practices Team
◦ Assessment/Literacy Tech Team
◦ HSTW Site Coordinators
Train key people within the district
◦ Success starts with a solid foundation
◦ Teacher-led PD would increase staff buy-in
Ensure teachers understood:
◦ The framework of the new evaluation system
◦ New components
◦ New tools
◦ New process
◦ That the process would increase teacher/admin collaboration time
Initial OTES staff training
◦ Teacher Performance – OTES Model
  Seven teaching standards, placed in the performance rubric
  Teacher evidence
  Calibration of a standard against a teaching video
  Completed self-assessment
◦ Student Growth – Student Learning Objectives
◦ November PD: mini-sessions on Formative Instructional Practices; Literacy with CC Anchor Standards in Reading and Writing; BFK and Value-Added; Teacher Evidence
◦ February PD: state training manual on Student Growth Measures
◦ Quarterly meetings used as checkpoints
◦ Conversations with the evaluator
Pre-Pre-Observation Conference
◦ Met with teachers (last two weeks of October) to review the process timeline and elements (paperwork and documentation) and to answer questions
◦ Follow-up to the in-service day
Self-Assessment Tool (optional in the process, but discovered to be a MUST DO!)
Professional Growth Plan (see example)
◦ Teacher Performance Goal (Goal 2): influenced by the Self-Assessment Tool and the Ohio Teaching Standards
  Self-directed
  Statement of the goal and how it will be measured
◦ Student Achievement Goal (Goal 1): influenced by the SLO (use the standard from the pre-assessment used to develop the SLO)
  Specific standard and how it will be measured
Teacher Performance Evaluation Rubric (TPER) (see example)
◦ Evaluation begins with Proficient ("rock-solid teaching")
◦ Video examples: see the NIET Training Modules
◦ Examine the key words and phrases embedded in the rubric at each level (Developing, Proficient, and Accomplished)
Pre-Conference
◦ Teachers complete and submit responses using ohiotpes.com
◦ Face-to-face
Observation
◦ 30 minutes
◦ Script the lesson!
Post-Conference
◦ Follows each observation
◦ Specific questions relating to the lesson and the instructor
◦ Relates back to the TPER
◦ Identifies an area for reinforcement/refinement
Classroom Walkthroughs (2–5 per teacher)
◦ Shared with the teacher
◦ Opportunity for feedback
◦ Used a paper form and ohiotpes.com
◦ The more often, the better!
Set the schedule early and put the onus on the teachers: it is the teachers' evaluation, and it is their responsibility to provide evidence and documentation relating to the TPER.
Round 1
◦ Pre-Conference
◦ Observation
◦ Walkthrough(s)
◦ Post-Conference
Round 2
◦ Pre-Conference
◦ Observation
◦ Walkthrough(s)
◦ Post-Conference
Summative Performance Rating Conference
◦ Use measures of student growth effectively in a high-quality evaluation system
◦ Make informed decisions on the right measures
◦ Make informed decisions about the appropriate weight of measures
◦ Increase the reliability and validity of selected measures
Teachers completed the development, implementation, and scoring process
◦ SLO timeline with specific due dates; calendar with expectations
◦ Teachers created their SLO and chose their growth targets
◦ Implemented the SLO
◦ Calculated the results
Three main types of targets used
◦ Whole-group targets
◦ Tiered/grouped targets
◦ Individual targets
Whole-group target: one target for all students in the SLO
◦ Example: all students will score 75% or better on the post-assessment
Tiered/grouped target: a range of targets for groups of students
◦ Example: students with pre-assessment scores between 0 and 25 would be expected to score between 25 and 50 on the post-assessment
Individual target: each student in the SLO receives a target score
◦ Example: using a formula such as growth target = (100 – pretest) / 2 + pretest
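The three target types above are simple arithmetic, and can be sketched in code. This is an illustrative sketch, not district software: the function names are invented here, the whole-group cut score of 75 and the single 0–25 tier come from the slide's examples, and the individual-target formula is the one quoted above.

```python
def individual_growth_target(pretest: float) -> float:
    """Individual target: the student must close half the gap to 100.
    Formula from the slide: (100 - pretest) / 2 + pretest."""
    return (100 - pretest) / 2 + pretest

def whole_group_met(post_scores, threshold=75):
    """Whole-group target: every student is compared to one fixed cut score
    (the slide's example uses 75% on the post-assessment)."""
    return [score >= threshold for score in post_scores]

def tiered_target(pretest: float):
    """Tiered/grouped target (illustrative): only the band given on the
    slide is defined -- pre-scores of 0-25 map to an expected 25-50 range."""
    if 0 <= pretest <= 25:
        return (25, 50)
    raise ValueError("only the 0-25 band is defined in the slide's example")
```

For instance, a student with a pretest of 28 gets an individual growth target of (100 − 28) / 2 + 28 = 64, which matches the roster on the next slide.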
Teacher Name: Formula Method    School: Mid-East CTC – Zanesville Campus
SLO:    Assessment Name:

Student  Baseline  Growth Target  Final Score  Met Target
1        28        64             80           Yes
2        20        60             48           No
3        44        72             76           Yes
4        28        64             76           Yes
5        12        56             –            –
6        48        74             84           Yes
7        20        60             44           No
8        28        64             52           No
9        40        70             88           Yes
10       32        66             84           Yes
11       28        64             60           No

6 of the 10 students with final scores met or exceeded their growth target: 60%
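The roster calculation above can be reproduced in a few lines. This is a sketch assuming the baseline and final scores as transcribed from the slide; student 5 has no recorded final score, so only ten students are scored, as in the slide's 60% result.

```python
# (baseline, final) pairs transcribed from the slide's example roster.
# Student 5 has no recorded final score (None) and is excluded from scoring.
students = [
    (28, 80), (20, 48), (44, 76), (28, 76), (12, None),
    (48, 84), (20, 44), (28, 52), (40, 88), (32, 84), (28, 60),
]

def growth_target(baseline):
    # Same "formula method" target as the slide: close half the gap to 100.
    return (100 - baseline) / 2 + baseline

scored = [(b, f) for b, f in students if f is not None]
met = sum(1 for b, f in scored if f >= growth_target(b))
print(f"{met} of {len(scored)} students met/exceeded their growth target "
      f"({met / len(scored):.0%})")
```

Running this reproduces the slide's summary line: 6 of 10 students, or 60%.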
Descriptive Rating    Percentage Exceeded/Met  Numerical Rating  Tiered  Formula  Whole Group
Most Effective        90–100%                  5                 10%     20%      10%
Above Average         80–89%                   4                 30%     20%      36%
Average               70–79%                   3                 20%     0%       9%
Approaching Average   60–69%                   2                 20%     6%       9%
Least Effective       59% or below             1                 20%     54%      36%
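The mapping from "percentage of students who met or exceeded their target" to the 1–5 numerical rating in the table above is a simple band lookup. A minimal sketch, assuming the bands as shown on the slide (the function name is ours):

```python
def slo_rating(pct_met: float):
    """Map the percentage of students meeting their growth targets to the
    1-5 rating scale from the slide's table."""
    bands = [
        (90, 5, "Most Effective"),
        (80, 4, "Above Average"),
        (70, 3, "Average"),
        (60, 2, "Approaching Average"),
        (0,  1, "Least Effective"),
    ]
    for floor, score, label in bands:
        if pct_met >= floor:
            return score, label
```

The example teacher, whose class met targets at 60%, would land at a numerical rating of 2, "Approaching Average".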
Fall 2011: 469 students were tested; 81% (379) earned a bronze or better
Intervention provided through KeyTrain Online
Spring 2012: 90 students were tested; 71% (64) earned a bronze or better
2011–2012 (Level I): 40% Bronze, 50% Silver, 5% Gold, 5% not yet
2012–2013 (Level II): 27% Bronze, 64% Silver, 7% Gold, 2% not yet
◦ 1 program Gold
◦ 11 programs Silver
◦ 14 programs Bronze
70% met or exceeded the occupational profile!
Fall 2012: 467 students were tested; 86% earned a bronze or better
Intervention provided through KeyTrain Online
Spring 2013: 60 students were tested; 55% earned a bronze or better
2012–2013 (Level I) / 2013–2014 (Level II)
37% Bronze, 54% Silver, 4% Gold, 5% not yet
Content Area  2012–2013  2011–2012  2010–2011
English       151        150        146
Math          143        143.25     141.75
Science       147.5      145        143.50
Zanesville Campus

Subject      College Readiness Benchmark  2013   2012   2011
Biology      156                          149    145.2  142.5
Chemistry    157                          147    144.9  144
Algebra I    152                          142    142.1  142.9
Geometry     152                          144    142.9  140.3
Algebra II   149                          143    144.5  142.4
PreCalculus  145                          143    144.5  141.2
English 10   147                          151    147.8  144.8
English 11   152                          149    150.2  147.3
Physics      150                          –      –      –
US History   150                          –      –      –
English 9    154                          –      –      –
English 12   153                          –      –      –
Buffalo Campus

Subject      College Readiness Benchmark  2013   2012   2011
Biology      156                          147    145.4  144.2
Chemistry    157                          145.4  143.5  –
Algebra I    152                          141    142.6  142.4
Geometry     152                          144    143.2  141.6
Algebra II   149                          144    143.9  142.8
PreCalculus  145                          –      –      –
English 10   147                          154    150.9  144.4
English 11   152                          151    151.9  146.3
Physics      150                          –      –      –
US History   150                          –      –      –
English 9    154                          –      –      –
English 12   153                          –      –      –
Teacher Category        Value Added  Vendor Assessments  LEA Measures  SLO/Other  Shared Attribution  Total
A (Value Added)         30%          –                   –             10%        10%                 50%
B (Vendor Assessments)  –            30%                 –             10%        10%                 50%
C (LEA Measures)        –            –                   40%           –          10%                 50%
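Under this scheme, the student-growth half of a teacher's evaluation is a weighted sum of the component ratings for that teacher's category. The sketch below is illustrative only: the weight values are as reconstructed from the slide's table (which is partly garbled in the source), and the function and key names are ours.

```python
# Weights reconstructed from the slide's table: each category's components
# sum to 0.50, the student-growth half of the overall evaluation.
WEIGHTS = {
    "A": {"value_added": 0.30, "slo_other": 0.10, "shared_attribution": 0.10},
    "B": {"vendor": 0.30, "slo_other": 0.10, "shared_attribution": 0.10},
    "C": {"lea_measures": 0.40, "shared_attribution": 0.10},
}

def student_growth_score(category: str, ratings: dict) -> float:
    """Combine per-measure ratings (e.g. on the 1-5 scale) into the
    weighted student-growth portion for the given teacher category."""
    weights = WEIGHTS[category]
    return sum(weights[measure] * ratings[measure] for measure in weights)
```

For example, a Category C teacher rated 4 on LEA measures and 3 on shared attribution would contribute 0.40 × 4 + 0.10 × 3 = 1.9 toward the composite.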
Data Analysis and Assessment Literacy
Data analysis and setting growth targets
◦ Data-driven decisions: what data?
◦ Engage in transparent conversations around student growth
  Outliers, class size, averages, ranges
  Identify trends, patterns, and expectations for mastery
  Gather other available data
  Zoom in and identify similarities and differences among students
Build staff assessment literacy
◦ Priority standards
◦ Appropriate assessments
◦ Quality assessment design
◦ Assessment blueprint reflects instruction
◦ Instructional tasks move students toward meeting standards
Conversations and Collaboration
Greater focus on student learning
Deeper reflection on the:
◦ teaching and learning process
◦ accountability for growth for all students
◦ role student growth plays in determining educator effectiveness
Policy developed using the template provided by ODE
◦ Kept simple to allow for flexibility
◦ Will change as we negotiate our contract
Further development of SLO guidelines by the district SLO Evaluation Committee
What made us breathe a sigh of relief when it was over
What went well / positive elements (yes, some things were quite positive!)
Suggestions and ways to use our internal feedback and insight to feed forward
Roller Coaster of Emotions
WHEW, that took some time!
◦ Five hours per teacher per observation? *gulp*
◦ What do I give up?
◦ Walkthroughs?
Technology
◦ eTPES downfalls: other evaluations? walkthrough data?
◦ Support for all levels of learners
Roller Coaster of Emotions
The process was positive overall
◦ Self-assessments prompted reflection
◦ Consensus: "It's not so bad!"
Technology-based
Trial year! WOOHOO!
Focused purpose and common dialogue
◦ Holistic
◦ Rubric
◦ Criteria, not a checklist
Collaboration
◦ Administrators with associate-school principals
◦ Administrators with administrators
◦ Administrators with teachers
◦ Teachers with teachers
◦ Utopia!
Self-assessment for everyone
Walkthrough data: a form that collects the data we need?
◦ Experimenting with Google Forms
◦ Use it to spot trends
Non-instructional staff evaluations
◦ OSCES, ASCA, and OTES
◦ Input from staff
Opportunities for more alignment
◦ for professionals to align all goals: IPDP, OTES/OPES, Resident Educator
◦ to look for trends and align them with PD
◦ to group professionals with aligned goals as they work together to improve their practice
◦ to align ourselves as evaluators: do we truly calibrate? Can we better align with each other by discussing our ratings and the reasons behind them?