April 19, 2012 SBE Presentation on Performance Evaluations.

Presentation transcript:

April 19, 2012 SBE Presentation on Performance Evaluations

Slide 1: Context

There are three major routines that comprise DDOE's management of districts' Race to the Top/Success plans:

Progress Reviews
- Purpose: Assess district progress on plan activities and identify opportunities to improve
- DDOE staff involved: Delivery Unit (DU) Chief Performance Officer, DU Deputy Officer, District Liaison
- District staff involved: Chief, RTTT manager, others as desired by the Chief
- Location: On-site at districts
- Frequency: 1-3 times a year, depending on grant size and performance

Performance Evaluations
- Purpose: Assess district performance on plan measures and identify opportunities to improve
- DDOE staff involved: Secretary of Education, Deputy Secretary, Chief Performance Officer, District Liaison
- District staff involved: Chief, Board Rep., Teacher Rep., RTTT manager, others as desired by the Chief
- Location: DDOE Cabinet Room
- Frequency: 1-2 times a year, depending on grant size and performance

Chiefs' Workshops
- Purpose: Discuss RTTT data and initiatives in PLCs and identify opportunities to improve
- DDOE staff involved: DDOE Leadership Team, DDOE Content Experts as needed
- District staff involved: Chief, plus 1-2 additional district leaders (as determined by the Chief)
- Location: DDOE Collette Center
- Frequency: Monthly during the school year

Slide 2: Performance Evaluations

Agenda
The mid-year performance evaluation conversation focused on:
- Initial thoughts on what is driving the district's strengths and challenges
- How the district will dig deeper to really understand what is going on, and
- How the district will then replicate its strengths and address its challenges

Purpose of evaluations
The purpose of performance evaluations is to assess the impact of district plans and district performance overall, and to identify opportunities to improve performance before final funding decisions are made in June.

What we heard
- Most districts had already begun to drill down into their data at the school and grade level
- Most districts had clear hypotheses about the drivers of their strengths
- Across many districts, effective PLCs, RTTT-funded specialists, and extended learning time programs were cited as drivers of district strengths
- Most districts felt they would need further analysis to understand the root causes of their challenges
- Across many districts, initiative overload was cited as a potential cause of district challenges

Slide 3: Performance Dashboards – Purpose, Status, and Use

Status and use of dashboards
- For the February 2012 performance evaluations, all data is formative and only marks progress toward LEAs' RTTT goals (which begin in spring 2012)
- All 19 districts will have performance evaluations in June; 14 of the districts had an additional mid-year performance evaluation at the end of February (based on grant size and/or performance to date)
- The dashboards are draft/for internal use only – please see the Performance Evaluation Overview for more information on this classification

Purpose of dashboards
- The dashboards were the primary focus of LEAs' performance evaluations
- Performance evaluation dashboards provide a picture of each LEA's performance against its Race to the Top goals, key state performance measures, and LEA-specific performance measures
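
As a rough illustration of what one dashboard entry might contain, here is a hypothetical record structure. The presentation does not define a schema, so every field name below is an assumption based only on the categories the slide lists (RTTT goals, key state performance measures, and LEA-specific measures).

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch of one dashboard entry; the actual DDOE
# dashboard layout is not specified in the presentation.
@dataclass
class DashboardEntry:
    district: str
    measure: str               # e.g., "Grade 5 math proficiency" (illustrative)
    category: str              # "RTTT goal" | "key state measure" | "LEA-specific"
    district_pct: float        # district performance, winter 2012
    state_pct: float           # statewide performance, used for color coding
    prior_year_pct: float      # same point last year, used for trend arrows
    spring_2012_goal: Optional[float] = None
    spring_2015_goal: Optional[float] = None
```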

Slide 4: Performance Dashboards – Guide to Understanding (State Example)

For each measure, the dashboard answers:
- Where are we in winter 2012?
- Where are we vs. last winter?
- What was our fall-to-winter growth?
- What was our fall-to-winter growth vs. last year?
- What is our spring 2012 goal?
- How many more students are needed to meet the goal?
- What is our spring 2015 goal?

How to read the dashboard:
- Colors are based on district performance vs. the state (green = above the state; red = below the state)
- Arrows are based on district performance this year vs. the previous year (up = performance has improved; neutral = performance has stayed within 1 percentage point; down = performance has declined)
- Goals are based on reducing non-proficiency by 50% by 2015 – a methodology similar to the one used in the ESEA Flexibility Application
- The "additional students to meet goal" calculation is based on the number of students who took the winter test, so it may not be exact
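
To make these conventions concrete, here is a minimal sketch of the color, arrow, goal, and additional-students logic. The function and parameter names are illustrative assumptions; only the rules themselves (the color coding, the one-percentage-point neutral band, the 50% non-proficiency-reduction goal, and the winter-test basis for the additional-students estimate) come from this slide.

```python
import math

def color(district_pct: float, state_pct: float) -> str:
    """Green = above the state; red = below the state.

    The slide does not say how ties are colored; this sketch
    treats them as green.
    """
    return "green" if district_pct >= state_pct else "red"

def arrow(this_year_pct: float, last_year_pct: float) -> str:
    """Up = improved; neutral = within 1 percentage point; down = declined."""
    delta = this_year_pct - last_year_pct
    if abs(delta) <= 1.0:
        return "neutral"
    return "up" if delta > 0 else "down"

def goal_2015(pct_proficient: float) -> float:
    """Goal = cut non-proficiency in half by 2015.

    E.g., 60% proficient means 40% non-proficient, so the 2015
    goal is 60 + 40/2 = 80% proficient.
    """
    return pct_proficient + (100.0 - pct_proficient) / 2.0

def additional_students_to_goal(current_pct: float, goal_pct: float,
                                winter_test_takers: int) -> int:
    """Rough count of additional proficient students needed.

    Based on the number of winter test takers, so (as the slide
    notes) it may not be exact for the spring administration.
    """
    gap_pct = max(goal_pct - current_pct, 0.0)
    return math.ceil(gap_pct / 100.0 * winter_test_takers)
```

For example, a district at 60% proficient when the state is at 58% shows green; if it was at 59.5% last winter, the arrow is neutral; its 2015 goal is 80% proficient; and with 500 winter test takers it would need roughly 100 additional proficient students to close the 20-point gap.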

Slide 5: State Data

Please see your handout for an overview of statewide trends based on the data.

Slide 6: District Data

Each of the 14 districts with scheduled performance evaluations received an overview with the following components:
- Plan highlights (from the plan submitted in June 2011)
- Progress review strengths (from the progress review conducted in October/November 2011)
- Performance strengths (from the dashboard generated in February 2012)
- Opportunities to strengthen performance (from the dashboard generated in February 2012)
- Additional relevant trends/data points (from the dashboard generated in February 2012)

All district-specific overviews were shared with the Innovation Action Team.

Slide 7: Next Steps

Stakeholder Communications
- DDOE shared the state and district dashboards with all of the stakeholder groups that comprise the Innovation Action Team
- DDOE provided the opportunity for each stakeholder group to schedule an individual overview of the performance evaluation process and findings

Public Communications
- DDOE publicly released the state dashboard, state summary, and district-specific strengths
- DDOE will use existing communication opportunities (e.g., the Governor's Rotary Club meetings) to highlight the performance evaluation process and findings
- DDOE will publicly release end-of-year district dashboards in summer 2012
- If the state's ESEA flexibility application is approved, DDOE will align and disseminate communications regarding the RTTT performance dashboards and the new accountability changes – the two methodologies are very similar, with some differences

Further Analysis
- DDOE used the February and March Chiefs' meetings to further discuss district data and initiatives
- DDOE is conducting further data analysis to identify district strengths, coupled with on-site visits in April to help understand the connection between district initiatives and performance data