March 28, 2011

What does the new law require?

 20% State student growth data (increases to 25% upon implementation of value-added growth model)  20% Locally selected (and agreed upon) measures (decreasing to 15%)  60% Multiple measures based on standards TBD

Being referred to as HEDI (pronounced Heidi)  Highly effective (possibly >90)  Effective (possibly 80-90)  Developing (possibly 65-79)  Ineffective (possibly 0-64)
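The composite rating described here is a straight weighted sum of the three components. A minimal sketch, assuming the initial 20/20/60 weights and treating the slide's "possible" HEDI cutoffs (>90, 80-90, 65-79, 0-64) as placeholders rather than adopted values:

```python
# Illustrative sketch of the proposed APPR composite score and HEDI rating.
# The weights and score bands come from the draft guidance summarized above;
# the cutoffs were only "possible" values at the time, so treat every number
# here as a placeholder, not the adopted regulation.

def composite_score(state_growth, local_measures, other_measures):
    """Combine the three components (each assumed scored 0-100)
    using the initial 20/20/60 weighting."""
    return (0.20 * state_growth
            + 0.20 * local_measures
            + 0.60 * other_measures)

def hedi_rating(score):
    """Map a composite score to a HEDI category using the 'possible'
    bands from the slide: >90, 80-90, 65-79, 0-64."""
    if score > 90:
        return "Highly effective"
    elif score >= 80:
        return "Effective"
    elif score >= 65:
        return "Developing"
    else:
        return "Ineffective"

score = composite_score(state_growth=85, local_measures=75, other_measures=70)
print(round(score, 1), hedi_rating(score))  # 74.0 Developing
```

The function names and the 0-100 component scale are illustrative assumptions, not part of the statute.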

 A single composite score of teacher (or principal) effectiveness

 Training for all evaluators (through Network Teams – after first week of August)  Use of improvement plans for developing and ineffective ratings  Utilize in other decisions (merit, etc.)  Locally-developed appeals process  Expedited 3020-a process after two ineffective ratings

 All agreements after July 1, 2010  For agreements prior to July 1, 2010, it depends on specific language in the agreement  4-8 math and ELA (and principals) July 2011  Everyone else July 2012  Implementation of the value-added growth model (20% increases to 25%)


Board of Regents Agenda

MONTH  January  February  March  April  May  June ACTION  60% discussion  Local 20% discussion  Value added 20% discussion and ratings/scores  Regents Task Force recommendations (4 th )  Draft Regulations  Emergency Adoption of Regulations

20% increasing to 25%

 Value Added/Growth model  Annual achievement is more about the students than the teacher Teacher A Teacher B

 Value Added/Growth model  Adding average prior achievement for the same students shows growth Teacher A Teacher B growth +25 growth

 Value Added/Growth model  Adding average prior achievement for the same students shows growth Teacher A Teacher B growth +25 growth

 Value Added/Growth model  But what growth should students have shown?  What growth did similar students obtain?  What is the difference between the expected growth and the actual growth?

 Value Added/Growth model  Comparing growth to the average growth of the similar student is the value-added Teacher A Teacher B growth avg. for similar students +25 growth +15 val add avg. for similar students +5 val add

 Value Added/Growth model  Comparing growth to the average growth of the similar student is the value-added Teacher A Teacher B growth avg. for similar students +25 growth +15 val add avg. for similar students +5 val add

 Calculating similar student growth  Lots of statistical analysis  Student characteristics such as academic history, poverty, special ed. status, ELL status, etc.  Classroom or school characteristics such as class percentages of needs, class size, etc.

 Data collection and policy options  Linking students, teachers, and courses  Who is the teacher of record? ▪ Scenario 1: Same Teacher the Entire Year ▪ Scenario 2: Team Teaching ▪ Scenario 3: Teacher for Part of the Year ▪ Scenario 4: Student for Part of the Year ▪ Scenario 5: Student Supplemental Instruction ▪ Additional Scenarios???
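None of the teacher-of-record scenarios above had a prescribed answer at this point. One common approach (an assumption here, not SED policy) is to link each student to each teacher with the fraction of the year they shared, then weight growth attribution by that fraction:

```python
# Hypothetical handling of the teacher-of-record scenarios above (team
# teaching, partial-year teachers and students): link each student to
# each teacher with the fraction of the year shared, then compute a
# duration-weighted average growth per teacher. Not an SED-specified
# method; the record layout and numbers are invented for illustration.

from collections import defaultdict

# (student, teacher, fraction_of_year_linked, student_growth)
links = [
    ("s1", "teacher_A", 1.0, 20),  # Scenario 1: same teacher all year
    ("s2", "teacher_A", 0.5, 10),  # Scenario 3: teacher for part of year,
    ("s2", "teacher_B", 0.5, 10),  # ...growth shared with a second teacher
]

def attributed_growth(links):
    """Duration-weighted average growth per teacher."""
    totals = defaultdict(lambda: [0.0, 0.0])  # teacher -> [weighted sum, weight]
    for _student, teacher, weight, growth in links:
        totals[teacher][0] += weight * growth
        totals[teacher][1] += weight
    return {t: ws / w for t, (ws, w) in totals.items()}

print(attributed_growth(links))
```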

Non-tested areas

 Teachers of classes with only one state test administration  K-12 educators  High school (no test) educators  Middle and elementary (no test) educators  Performance courses  Others

 Use existing assessments in other content areas to create a baseline for science tests and Regents examinations  Use commercially available tests to create a baseline and measure growth

 Add more state tests, such as:  Science 6-8  Social studies 6-8  ELA 9-11 ( )  PARCC ELA 3-11 ( )  PARCC math 3-11 ( )

 Add more state tests, according to December 2009 Regents Item; discussed and approved prior to inclusion in SED’s plans:  ELA 9-11 ( )

 Add more state tests, subject to funding availability and approval, such as:  Science 6-7  Social studies 6-8

 The growth model also can be used for school accountability measures  Collaborate with state-wide professional associations or a multi-state coalition  Empower local-level resources to create and carry out a solution that meets state requirements

 Use a group metric that is a measure of the school (or grade’s) overall impact  In other states where this is implemented it tends to be tied to performance bonuses

20% decreasing to 15%

 Objectives include:  Provide a broader picture of student achievement by assessing more  Provide a broader picture by assessing differently  Verify performance of state measures

 Reality check:  Balance state/regional/BOCES consistency while accounting for local context  School-based choice might appeal to teachers  Districts must be able to defend their decisions about the tests

 Considerations include:  Rigor  Validity and reliability  Growth or achievement measures  Cost  Feasibility

 Options under consideration:  Districts choose or develop assessments for courses/grades  Commercially available products  Group metric of school or grade performance  Other options that meet the criteria (previous slide)

Multiple measures

 Begins with the teaching standards:
1. Knowledge of Students and Student Learning
2. Knowledge of Content and Instructional Planning
3. Instructional Practice
4. Learning Environment
5. Assessment for Student Learning
6. Professional Responsibilities and Collaboration
7. Professional Growth

 Begins with the teaching standards:  Some things observable  Some not observable thus requiring some other form or documentation or artifact collection

 Teacher practice rubrics:  Describe differences in the four performance levels  Articulate specific, observable differences in student and teacher behavior  Not known whether there will be a single rubric, menu to choose from, or total local option


 Other items that might be included:  Teacher attendance  Goal setting  Student surveys  Portfolios/Evidence binders  Other observer

MONTH  August  September ACTION  NT Training (included evaluator training)  NT turns training to local evaluators  Implementation for covered teachers

 Tentative dates set (with multiple options):  August 15, Rodax 8 Large Conference Room  August 22, McEvoy Conference Center  August 29, Rodax 8 Large Conference Room  Ongoing training during year (TBD)

 Tentative dates set (with multiple options):  August 19, Rodax 8 Small Conference Room  August 26, McEvoy Conference Center  Ongoing training during year (TBD)

 Regional/BOCES collaboration:  Share data  Share APPR Plans  Build common understanding  Work on parts under local jurisdiction  Avoid duplication of work  Have a common voice

 APPR sub-site:  APPR button under “for school districts” at ocmboces.org or leadership.ocmboces.org  User name: lrldocs  Password: CBA1011

 Regional/BOCES collaboration:  Development of local 20% protocol  Achievement in non-tested areas  Qualities of effective Improvement plans and examples  Appeals process  Frameworks/models  Summative evaluation (examples, best practices, share practices)  Principal Evaluation (added back)

 Share results of this afternoon’s work  Gather again on __________  Updates  Continue collaboration