
1 Pilot of State Model Principal Evaluation System
Year One Pilot of S.B. 191
Colorado Department of Education, Educator Effectiveness
September 12, 2012

2 Evaluation and Continuous Improvement of the Statewide System to Evaluate the Effectiveness of Licensed Personnel

The first year of the pilot went very well. While we have some interesting findings that we are monitoring, we do not have any recommended rule changes at this time. We believe the rules instituted in November are still appropriate and relevant for the upcoming year.

Section 6.05 of the Rules for Implementation of S.B. 191: The Department shall use information obtained through monitoring and reporting efforts to identify opportunities for improvement. No later than July 1 of each year, beginning in 2012, the State Board shall review these rules (1 CCR 301-87) and, informed by recommendations from the State Council and using information from implementation of the State Model System and other local systems, shall determine whether to affirm or revise the rules in order to reflect what has been learned.

3 Timeline of Implementation

Year One (2011-12): Development and Beta Testing. CDE activities:
– Develop State Model Systems for teachers and principals
– Beta-test rubrics and tools
– Develop technical guidelines on professional practices and student growth
– Provide differentiated support for districts
– Populate and launch online Resource Bank
– Develop state data collection and monitoring system
– Develop tools for district implementation of the system

Year Two (2012-13): Pilot and Rollout. CDE activities:
– Usability study of rubrics
– Support pilot districts through resources, training, tools, etc.
– Convene pilot districts to share lessons learned
– Analyze pilot district data and make adjustments as needed
– Train all non-pilot districts that are using the state model
– Make recommendations on other licensed personnel (OLP) to the State Board of Education (SBE)

Year Three (2013-14): Pilot and Rollout. CDE activities:
– Statewide assistance on rollout of evaluation systems
– Develop evaluation system for other licensed personnel
– Support all districts through resources, trainings, tools, etc.
– Convene pilot districts to share lessons learned
– Analyze state data and make adjustments to the system as needed
– Validate teacher and principal rubrics
– Develop criteria for evaluation training courses for approval by CDE

Year Four (2014-15): Full Statewide Implementation. CDE activities:
– Finalize statewide implementation of teacher/principal systems
– Pilot OLP rubrics
– Continue support to districts via resources and training
– Ensure there are evaluator training courses throughout the state
– Analyze data and make adjustments as needed
– Make recommendations to the SBE this year and in all following years for continuous improvement

4

5 Principal Evaluation System
Piloted in 27 districts across the state; principal pilot data were received from 22 districts.

CDE Pilot Districts: Center, Crowley, Custer, Del Norte, Eads, Jefferson, Miami-Yoder, Moffat, Mountain Valley, Platte Canyon, Salida, South Routt, St. Vrain, Valley Sterling, Wray

CDE/CLF Integration Districts: Centennial, Archuleta, Bayfield, Dolores RE-2, Dolores RE-4, Durango, Eagle, Ignacio, Mancos, Montezuma Cortez, Silverton, Thompson

              # of evaluations expected*   # of evaluations received   % received
Statewide     526                          241                         46%
Range         0-298                        0-131                       0-200%
* Expected number of evaluations = total number of administrators

6 Overall Reflections on the Pilot
– Managing change
– Trainings of superintendents, principals, and assistant principals throughout the fall of 2011
– "Finally we have a road map"
– Learning about management of the process
– Interest and curiosity about Student Growth
– Overall, we are getting a positive response

7 Principal Quality Standards and Elements

Quality Standard 1: Principals demonstrate strategic leadership.
–Element a: School Vision, Mission and Strategic Goals
–Element b: School Plan
–Element c: Leading Change
–Element d: Distributive Leadership

Quality Standard 2: Principals demonstrate instructional leadership.
–Element a: Curriculum, Instruction, Learning and Assessment
–Element b: Instructional Time
–Element c: Implementing High-quality Instruction
–Element d: High Expectations for all Students
–Element e: Instructional Practices

Quality Standard 3: Principals demonstrate school culture and equity leadership.
–Element a: Intentional and Collaborative School Culture
–Element b: Commitment to the Whole Child
–Element c: Equity Pedagogy
–Element d: Efficacy, Empowerment and a Culture of Continuous Improvement

Quality Standard 4: Principals demonstrate human resource leadership.
–Element a: Professional Development/Learning Communities
–Element b: Recruiting, Hiring, Placing, Mentoring, and Dismissal of Staff
–Element c: Teacher and Staff Evaluation

Quality Standard 5: Principals demonstrate managerial leadership.
–Element a: School Resources and Budget
–Element b: Conflict Management and Resolution
–Element c: Systematic Communication
–Element d: School-wide Expectations for Students and Staff
–Element e: Supporting Policies and Agreements
–Element f: Ensuring an Orderly and Supportive Environment

Quality Standard 6: Principals demonstrate external development leadership.
–Element a: Family and Community Involvement and Outreach
–Element b: Professional Leadership Responsibilities
–Element c: Advocacy for the School

The italicized elements were not included in the 2011-12 pilot of the principal evaluation system but will be included in the 2012-13 pilot.

8 Ratings on Quality Standards
– The timeframe of the cumulative evaluation was October 2011 – May 2012.
– Principal evaluators were instructed to conduct a mid-year review and an end-of-year evaluation (end-of-year evaluation ratings are reported in this presentation).
– Individual districts determined how many observations, if any, were conducted.
– Evaluation ratings include observable and non-observable components of Quality Standards 1-6, for an overall assessment of principal performance.

9 Ratings Distributions: Standards Summary and Overall Rating
Rating distributions were reported for:
– Standard 1 – Strategic Leadership
– Standard 2 – Instructional Leadership
– Standard 3 – School Culture and Equity Leadership
– Standard 4 – Human Resource Leadership
– Standard 5 – Managerial Leadership
– Standard 6 – External Development Leadership

90% of principals received a rating of Proficient or above on the Overall Rating. Principals received higher ratings on Standard 4 (Human Resource Leadership) and lower ratings on Standard 1 (Strategic Leadership).

10 Standard 1 – Strategic Leadership
Strategic Leadership is the lowest rated Standard, with 86% of educators given a rating of Proficient or higher.
Rating distributions were reported for:
– Element 1a – School Vision, Mission and Strategic Goals
– Element 1b – School Improvement Planning
– Element 1c – Leading Change
– Element 1d – Distributive Leadership

11 Standard 2 – Instructional Leadership
Instructional Leadership contains the highest rated Element (Element 2b – Instructional Time), with 95% of educators receiving a rating of Proficient or higher. This Standard also contains the lowest rated Element (Element 2c – Implementing High Quality Instruction), with 69% of educators receiving a rating of Proficient or higher.
Rating distributions were reported for:
– Element 2a – Curriculum, Instruction, Learning and Assessment
– Element 2b – Instructional Time
– Element 2c – Implementing High Quality Instruction
– Element 2d – High Expectations for All Students

12 Correlations with Students' Academic Outcomes

Correlations indicate the strength of the relationship between two measures (in this case, between evaluation ratings and aggregate CSAP scores). 0 indicates no relationship and 1 indicates a perfect positive relationship. Most of the correlations below are small but statistically significant. On average, principals who received higher evaluation ratings also have higher student growth and achievement in their school.

Principal ratings were positively correlated with:
– CSAP Reading growth
– CSAP Math growth and achievement

                                            N     Std 1   Std 2   Std 3   Std 4   Std 5   Std 6   Overall
Student Growth
  CSAP Reading MGP                          211   .316**  .294**  .278**  .224**  .271**  .095    .309**
  CSAP Math MGP                             211   .262**  .221**  .237**  .175*   .192**  .053    .273**
Student Achievement
  CSAP Reading, % proficient or advanced    213   .134    .071    .132    .106    .191**  .127    .133
  CSAP Math, % proficient or advanced       213   .202**  .178**  .196**  .161*   .190**  .115    .160*

** Correlation is significant at the 0.01 level
*  Correlation is significant at the 0.05 level
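
For readers who want to reproduce this kind of analysis on their own data, the sketch below computes Pearson correlations with significance flags between principal ratings and school-level CSAP aggregates. It is illustrative only; the file name and column names are hypothetical, and only the general approach mirrors the table above.

```python
# Illustrative sketch (not CDE's actual analysis code): Pearson correlations between
# principal evaluation ratings and school-level CSAP outcomes, with significance flags.
import pandas as pd
from scipy.stats import pearsonr

df = pd.read_csv("principal_pilot_2011_12.csv")  # hypothetical file: one row per principal/school

rating_cols = [f"standard_{i}_rating" for i in range(1, 7)] + ["overall_rating"]
outcome_cols = ["csap_reading_mgp", "csap_math_mgp",
                "csap_reading_pct_prof_adv", "csap_math_pct_prof_adv"]

for outcome in outcome_cols:
    for rating in rating_cols:
        pair = df[[outcome, rating]].dropna()  # pairwise deletion, so N can vary by outcome
        r, p = pearsonr(pair[outcome], pair[rating])
        flag = "**" if p < 0.01 else ("*" if p < 0.05 else "")
        print(f"{outcome:<28} x {rating:<18}  N={len(pair):>3}  r={r:+.3f}{flag}")
```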

13 Investigating Whether the System is Fair and Valid

Fairness: the extent to which a system is marked by impartiality and free from favoritism
– Rubric ratings don't vary based on student, school, or principal demographic characteristics (e.g., they don't vary based on the % of minority students in a school or the principal's gender).
→ These findings indicate that the rubric reflects a common standard that is fair and equally applicable across different demographic groups.

Validity: the extent to which a variable, scale, or set of measures accurately represents the concept of interest
– Rubric text is aligned with other content and leadership standards.
– Element ratings are internally cohesive and group into the established Standards.
– Rubric ratings are correlated with other measures of school success, specifically the aggregated CSAP growth and achievement in the school.
– Rubric ratings don't vary based on student or school characteristics, indicating that the rubric is valid across different demographic groups.
→ These findings indicate that the rubric is measuring the right things, we're measuring what we think we're measuring, and what we're measuring is aligned with other measures of success.
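
The fairness checks described above could be run in several ways; a minimal sketch follows, assuming a hypothetical analysis file with ratings, school demographics, and principal gender. The specific tests used here (a correlation with % minority students and a Welch t-test by gender) and the column names are illustrative assumptions, not necessarily the methods CDE used.

```python
# Illustrative sketch only (hypothetical file and column names): two simple checks that
# rubric ratings do not vary with school or principal demographics.
import pandas as pd
from scipy.stats import pearsonr, ttest_ind

df = pd.read_csv("principal_pilot_2011_12.csv")  # hypothetical file

# 1) Ratings vs. % minority students: a near-zero, non-significant r supports fairness.
pair = df[["overall_rating", "pct_minority_students"]].dropna()
r, p = pearsonr(pair["overall_rating"], pair["pct_minority_students"])
print(f"overall rating vs. % minority students: r={r:+.3f}, p={p:.3f}")

# 2) Ratings by principal gender: a non-significant mean difference supports fairness.
female = df.loc[df["principal_gender"] == "F", "overall_rating"].dropna()
male = df.loc[df["principal_gender"] == "M", "overall_rating"].dropna()
t, p = ttest_ind(female, male, equal_var=False)  # Welch's t-test (unequal variances)
print(f"overall rating by gender: t={t:+.2f}, p={p:.3f}")
```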

14 Investigating Whether the System is Reliable

Reliability: the extent to which a variable or set of variables is consistent in what it is intended to measure
– Rubric ratings at the Standard level are highly correlated with each other.
– Rubric ratings at the Element level are highly correlated with each other.
– Elements within Standards are highly correlated with each other.
– Reliability coefficients for each Standard are strong.
→ These findings indicate consistency of ratings across Standards, across Elements, and across Elements within their designated Standards. For example, principals who received a rating of Exemplary on Standard 1 were more likely to receive Exemplary ratings on the other Standards.
→ These findings also indicate that each Standard captures a dimension of school leadership.
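
The "strong reliability coefficients" noted above are internal-consistency statistics; a common choice is Cronbach's alpha, sketched below for the Elements of one Standard. The file and column names are hypothetical, and the presentation does not state which coefficient CDE computed.

```python
# Illustrative sketch (hypothetical file and column names): Cronbach's alpha for the
# Element ratings that make up one Standard. Not CDE's actual analysis code.
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a DataFrame whose columns are item (Element) ratings."""
    items = items.dropna()
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()  # sum of the item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_vars / total_var)

df = pd.read_csv("principal_pilot_2011_12.csv")  # hypothetical file
standard_1_elements = df[["element_1a", "element_1b", "element_1c", "element_1d"]]
print(f"Standard 1 alpha: {cronbach_alpha(standard_1_elements):.2f}")
```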

15 Principal Baseline and Feedback Surveys
Principals were surveyed before (i.e., baseline) and after (i.e., feedback) they experienced the state model system in the 2011-12 school year. Across the board, principals responded more positively in the post-survey to questions about the state model system.
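
As a rough illustration of how "responded more positively" could be quantified, the sketch below compares matched baseline and feedback responses to one survey item with a paired t-test. The file, item names, and the choice of a paired t-test are assumptions for illustration, not CDE's documented survey analysis.

```python
# Illustrative sketch: paired comparison of one survey item across the baseline and
# feedback waves. The file and column names are hypothetical.
import pandas as pd
from scipy.stats import ttest_rel

df = pd.read_csv("principal_surveys_2011_12.csv")  # hypothetical file: one row per principal

matched = df[["q1_baseline", "q1_feedback"]].dropna()  # keep principals who answered both waves
t, p = ttest_rel(matched["q1_feedback"], matched["q1_baseline"])
mean_change = (matched["q1_feedback"] - matched["q1_baseline"]).mean()
print(f"mean change = {mean_change:+.2f} scale points, t = {t:+.2f}, p = {p:.3f}")
```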

16 Next Steps in the Pilot of the State Model Evaluation System
– Run additional analyses on 2011-12 principal pilot data (example: compare with data from the Teaching, Empowering, Leading and Learning (TELL) Survey, the state's biennial educator perception survey)
– Continue to pilot the principal evaluation system in the 2012-13 school year
– Pilot the teacher evaluation system in the 2012-13 school year
– Build out the student growth component
– Focus on inter-rater agreement, which involves continued training and calibration of evaluators (see the sketch after this list)
– Continue to collect rubric and other evaluation data in order to conduct studies of reliability and validity
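
The inter-rater agreement work mentioned in the list could be quantified with a statistic such as Cohen's kappa, computed over principals rated independently by two evaluators. The sketch below is illustrative only: the data file, column names, and the assumed five-level rating scale are hypothetical, and the presentation does not specify which agreement statistic CDE will use.

```python
# Illustrative sketch: Cohen's kappa between two evaluators who independently rated the
# same principals. File, column names, and the rating scale are assumptions.
import pandas as pd
from sklearn.metrics import cohen_kappa_score

df = pd.read_csv("double_scored_evaluations.csv")  # hypothetical: one row per co-rated principal

# Assumed ordinal rating scale, lowest to highest.
scale = ["Basic", "Partially Proficient", "Proficient", "Accomplished", "Exemplary"]
a = df["evaluator_a_overall_rating"].map(scale.index)  # map labels to 0-4 codes
b = df["evaluator_b_overall_rating"].map(scale.index)

kappa = cohen_kappa_score(a, b)                        # exact-agreement kappa
weighted = cohen_kappa_score(a, b, weights="linear")   # gives partial credit for near-misses
print(f"Cohen's kappa = {kappa:.2f}; linearly weighted kappa = {weighted:.2f}")
```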

