- Discuss the charge of the Michigan Council for Educator Effectiveness (MCEE)
- Summarize the MCEE Interim Report
- Provide an overview of the pilot
Evaluations
- Evaluations must occur annually, must take place at the end of the year, and must be based on multiple observations (at least two).
- Districts MUST implement a rating system by September 10, 2011 that rates each educator as highly effective, effective, minimally effective, or ineffective.
- Beginning with the 2013-2014 school year, the evaluation system for teachers and administrators must be based largely on student growth and assessment data.
Portion of the annual evaluation that must be based on student growth:
- 2013-2014 school year: at least 25%
- 2014-2015 school year: at least 40%
- 2015-2016 school year: at least 50%
The annual year-end evaluation shall be based on the student growth and assessment data for the most recent 3 consecutive school-year period. If data are not available for the teacher for at least 3 school years, the annual year-end evaluation shall be based on all assessment data that are available for the teacher.
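The data-selection rule above can be sketched in code. This is an illustrative sketch only; the function name, the year keys, and the assumption that the records are contiguous are all assumptions for discussion, not part of the statute.

```python
# Illustrative sketch of the statute's rule: use the most recent 3 school
# years of growth data when available; otherwise use all data on hand.
# Data shapes and names are hypothetical, not drawn from any state system.

def growth_data_for_evaluation(data_by_year):
    """data_by_year maps a school year (e.g. 2013 for 2013-14) to a growth score."""
    years = sorted(data_by_year)
    if len(years) >= 3:
        # Most recent 3 school years on record (assumes contiguous records)
        recent = years[-3:]
        return {y: data_by_year[y] for y in recent}
    # Fewer than 3 years available: base the evaluation on all available data
    return dict(data_by_year)

print(growth_data_for_evaluation({2011: 55, 2012: 60, 2013: 65, 2014: 70}))
```

With four years on record, only the 2012-2014 entries would be kept; a teacher with a single year of data would be evaluated on that year alone.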
Observations
- Shall include a review of the lesson plan
- Shall include the state curriculum standard being used in the lesson
- Shall include a review of pupil engagement in the lesson
- The observation does not have to cover the entire class period
Ineffective Ratings
- Beginning with the 2015-16 school year, a board must notify the parent of a student assigned to a teacher who has been rated ineffective on his or her two most recent annual year-end evaluations.
- Any teacher or administrator who is rated ineffective on three consecutive annual year-end evaluations must be dismissed from employment.
Districts are not required to comply with the Governor's teacher/administrator evaluation tools if they have an evaluation system that:
- Bases its most significant portion on student growth and assessment data
- Uses research-based measures to determine student growth
- Factors teacher effectiveness and ratings, as measured by student achievement and growth data, into teacher retention, promotion, and termination decisions
- Uses teacher/administrator results to inform the educator's professional development for the succeeding year
- Ensures that teachers/administrators are evaluated annually
Districts must notify the Governor's Council of the exemption by November 1st.
Appointed by Governor Snyder:
- Deborah Ball
- Mark Reckase
- Nick Sheltrown
Appointed by Senate Majority Leader:
- David Vensel
Appointed by Speaker of the House:
- Jennifer Hammond
Appointed by Superintendent of Public Instruction (non-voting member):
- Joseph Martineau
Advisory committee
- Appointed by the Governor
- Provides input on the Council's recommendations
- Includes teachers, administrators, and parents
No later than April 30, 2012, the Council must submit a student growth and assessment tool that:
- Is a value-added model
- Measures growth in core areas and other areas
- Complies with laws for students with disabilities
- Has at least a pre- and post-test
- Can be used with students of varying ability levels
No later than April 30, 2012, the Council must also submit a state evaluation tool for teachers (general and special education), including instructional leadership abilities, attendance, professional contributions, training, progress reports, school improvement progress, peer input, and pupil and parent feedback. The Council must seek input from local districts.
No later than April 30, 2012, the Council must also submit:
- A state evaluation tool for administrators, including attendance, graduation rates, professional contributions, training, progress reports, school improvement plan progress, peer input, and pupil/parent feedback
- Recommended changes to the requirements for the professional teaching certificate
- A process for evaluating and approving local evaluation tools
Vision Statement: The Michigan Council for Educator Effectiveness will develop a fair, transparent, and feasible evaluation system for teachers and school administrators. The system will be based on rigorous standards of professional practice and of measurement. The goal of this system is to contribute to enhanced instruction, improve student achievement, and support ongoing professional learning.
Selection Criteria
1. Alignment with state standards
2. Instruments describe practice and support teacher development
3. Rigorous and ongoing training program for evaluators
4. Independent research to confirm validity and reliability
5. Feasibility
Systems
1. Marzano Observation Protocol
2. Thoughtful Classroom
3. Five Dimensions of Teaching and Learning
4. Framework for Teaching
5. Classroom Assessment Scoring System
6. TAP
Lessons Learned
1. A pilot is essential
2. Phasing in
3. Number of observations
4. Other important components
Challenges
1. Being fiscally responsible
2. Ensuring fairness and reliability
3. Assessing the fidelity of protocol implementation
4. Determining the equivalence of different instruments
- Recognize that student growth can give insight into teacher effectiveness
- Admit that student growth is not clearly defined
- Descriptions of growth vary and include:
  - Tests
  - Analytic techniques for scoring
  - Measures of value-added modeling (VAM)
  - Simple vs. complex statistics
- New York: 40% state assessments, with local assessments approved by the State
- Arizona: 33%-50% of the evaluation; locals determine multiple measures; tested subject areas must use State data for one measure
- Colorado: 50% growth on state assessments, with other measures for non-tested subjects
- Delaware: 50% school-wide assessment measure, based on the State assessment (30%) and a student cohort assessment (20%)
- Florida: 50% on State assessments for teachers in tested subjects (40% if fewer than 3 years of data); 30% on the State assessment for teachers in non-tested subjects
Challenges
1. Measurement error in standardized and local measurements
2. Balancing fairness toward educators with fairness toward students
3. Non-tested grades and subjects
4. Tenuous roster connections between students and teachers
5. Number of years of data
Question #1: Should the State evaluation data (i.e., MEAP, MME, etc.) be the only source of student growth data? Why or why not?
Question #2: Should local student growth models be allowed? Why or why not?
Question #3: If you agree that multiple measures should be allowed, what percentage would you give each of the multiple measures? For example, if educators are permitted to use MME data, a local tool such as an end-of-course assessment, and a personally developed measure, how should those three measures be weighted?
Question #4: How should we measure teachers in non-tested subjects such as band or auto mechanics?
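To make the weighting issue in Question #3 concrete, the sketch below combines three hypothetical growth measures into a single composite score. The weights, the 0-100 scale, and the measure names are assumptions chosen purely for illustration; they are not a Council recommendation or any state's policy.

```python
# Illustrative only: weighted composite of three hypothetical growth
# measures. Weights and scores are assumptions for discussion, not policy.

def composite_growth(measures, weights):
    """Weighted average of growth scores, each on a common 0-100 scale."""
    if abs(sum(weights.values()) - 1.0) > 1e-9:
        raise ValueError("weights must sum to 1")
    return sum(measures[name] * w for name, w in weights.items())

# Hypothetical scores for one teacher (0-100 scale)
scores = {"mme_growth": 62.0, "end_of_course": 75.0, "personal_measure": 80.0}

# One possible weighting: state data counts most, the local end-of-course
# assessment next, the personally developed measure least
weights = {"mme_growth": 0.5, "end_of_course": 0.3, "personal_measure": 0.2}

print(composite_growth(scores, weights))  # 69.5
```

Changing the weights shifts which measure drives the rating, which is exactly the trade-off Question #3 asks respondents to weigh.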
- 12 school districts
- Pilot the teacher observation tool
- Pilot the administrator evaluation tool
- Train evaluators, principals, and teachers
- Provide information on validity
- Gather feedback from teachers and principals
- 3 observation tools
- Student growth model/VAM pilot
Estimated Timeline for Completing Recommendations
- June 2012: Observation tool(s); details regarding the 2012-2013 pilot year
- July 2012: Other components of teacher evaluation systems
- October 2012: Student growth model
- November 2012: Evaluation tool for school administrators; details regarding the pilot of administrator evaluations; district waiver processes and principles
- April 2013: Professional certificate
- June 2013: Review all recommendations and adjust based on new data and information