Value Added Model and Evaluations: Keeping It Simple
Polk County Schools – November 2015

The Statute:

(3)(a)1. Performance of students.—At least one-third of a performance evaluation must be based upon data and indicators of student performance in accordance with subsection (7). This portion of the evaluation must include growth or achievement data of the teacher’s students or, for a school administrator, the students attending the school over the course of at least 3 years. If less than 3 years of data are available, the years for which data are available must be used.

(7)(b) Each school district shall measure student learning growth using the formulas approved by the commissioner under paragraph (a) and the standards for performance levels adopted by the state board under subsection (8) for courses associated with the statewide, standardized assessments administered under s. no later than the school year immediately following the year the formula is approved by the commissioner. For grades and subjects not assessed by statewide, standardized assessments, each school district shall measure student performance using a methodology determined by the district.

What is Value-Added? Value-added models measure the influence of schools or teachers on the academic growth rates of students. Value-added compares the change in achievement of a group of students from one year to the next to an expected amount of change based on their prior achievement history and other potential influences.

Comparing Student Performance: Traditional Achievement/Proficiency Models
- All students are expected to score at or above proficiency levels, regardless of individual characteristics
- Schools & teachers evaluated based on percentage of students attaining goals
- Same expectations for all schools & teachers
- Examples: School grades model; Polk evaluation manual for teachers of courses without state-developed VAM data

Florida’s Variables in Value Added Model
- Up to two years of prior achievement scores
- Number of subject-relevant courses
- Disability status
- English language learner status
- Gifted status
- Mobility
- Attendance
- Difference from modal age (such as retention)
- Classroom characteristics – class size and homogeneity of prior test scores
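
One way to picture how these variables are used: they become inputs to a statistical prediction of each student's current-year score. The sketch below is a deliberately simplified, hypothetical illustration (ordinary least squares on invented data, using only three of the variables above); Florida's actual model is a more complex covariate-adjustment model, so treat this only as a sense of the mechanics.

```python
# A minimal, hypothetical sketch -- NOT the actual FLDOE model or real data.
# Idea: predict this year's score from prior achievement and a couple of the
# characteristics listed above, then read actual-minus-predicted as growth
# beyond expectation.
import numpy as np

# Columns: prior-year score, number of absences, ELL status (0/1)
X = np.array([
    [310.0,  3, 0],
    [298.0, 12, 1],
    [340.0,  1, 0],
    [305.0,  6, 0],
    [322.0,  4, 0],
    [288.0, 15, 1],
    [335.0,  2, 0],
    [315.0,  8, 1],
])
y = np.array([338.0, 301.0, 372.0, 330.0, 349.0, 290.0, 366.0, 322.0])

X1 = np.column_stack([np.ones(len(X)), X])       # add an intercept term
coef, *_ = np.linalg.lstsq(X1, y, rcond=None)    # ordinary least squares fit

predicted = X1 @ coef
residuals = y - predicted   # positive = student grew more than expected
print(np.round(residuals, 1))
```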

Comparing Student Performance: Value Added Models
- Students’ individual characteristics, such as number of absences, prior test data, ELL status, etc., are used to create a cohort group of students statewide
- Students are examined in relation to their peers with same characteristics in the cohort group
- Expectations of performance are differentiated for all students based on these characteristics
- Schools & teachers evaluated based on students’ difference from predicted individual score
- Examples: Florida’s value added model; district-developed models being considered for evaluations by Polk’s Teacher Evaluation Advisory Committee

What Scores Are Used in the Value Added Model?
Value Added Models were calculated for:
- FSA ELA
- FSA Mathematics
- Grade 9 FSA Algebra 1 EOC
FCAT Reading and Mathematics scores were used as prior-year data.

Prior Achievement Scores
In subjects that are annually tested, such as ELA and math, it is easy to understand how:
- A prior reading test can be predictive of performance on an ELA test (e.g., 5th grade reading performance predicts 6th grade ELA performance)
- A prior math test can be predictive of performance on a math test (e.g., 4th grade math performance predicts 5th grade math performance)

Let’s Look at SIMPLE Prediction
Basic prediction assumes that a student who scored 375 in Year 1 will score similarly to all other students who scored 375. If those students score 400 in Year 2, the model predicts that this student will score about 400 as well. If the student scores above the prediction, we have value added.
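
In code, this simple prediction amounts to "look at what similar students did." The snippet below is a hypothetical back-of-the-envelope sketch of the idea on this slide (made-up scores, peer-average prediction), not the state's actual formula.

```python
# Hypothetical numbers illustrating the slide above, not real student data.
year1 = [375, 375, 375, 375, 375]      # peers with the same Year 1 score
year2 = [400, 398, 402, 401, 399]      # what those peers scored in Year 2

predicted = sum(year2) / len(year2)    # simple prediction: the peer average

student_actual = 410                   # our student's Year 2 score
value_added = student_actual - predicted
print(f"Predicted: {predicted:.1f}, Actual: {student_actual}, "
      f"Value added: {value_added:+.1f}")
```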

Predictions Are Not Always in an Upward Direction
Some cohorts of students, such as those at the top of the scale, or those who have a significant number of absences or a significant cognitive impairment, may have a negative prediction (a predicted score lower than the prior score). This individualization is possible in VAM because students are compared with peers who have the same characteristics.

Understanding the Value Added Scale
A zero is GOOD!
- A zero indicates that the predicted data and the actual data match (like achieving par on a golf course).
- A positive number means student growth exceeded what was projected.
- A negative number means students did not show as much growth as expected.

An Analogy
GOLF
- The player has 18 different holes
- The player earns a score at each
- Some holes are at par, some are under, some are over
- The overall score is an aggregation of the individual scores
- Par = meeting expectations
VALUE ADDED
- The teacher has several class periods
- The teacher has a score for each
- Some students achieve as expected, some underperform, some overperform
- The overall score is an aggregation of the individual scores
- Zero = meeting expectations
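
Continuing the analogy in code: a plausible, simplified roll-up averages each student's actual-minus-predicted difference within a class period and then combines the periods, weighted by how many students each one has. This weighting scheme is an assumption made for illustration; the actual FLDOE aggregation is more involved.

```python
# Hypothetical roll-up of per-student (actual - predicted) differences:
# average within each class period, then combine across periods weighted
# by the number of students in each. Illustration only.
periods = {
    "Period 1": [+4.0, -1.5, +2.0, 0.0, +3.5],
    "Period 2": [-2.0, -0.5, +1.0, -3.0],
    "Period 3": [+0.5, +1.5, -1.0, +2.0, +0.5, -0.5],
}

total_diff = sum(sum(scores) for scores in periods.values())
total_students = sum(len(scores) for scores in periods.values())
overall = total_diff / total_students   # zero means "par": growth as expected

for name, scores in periods.items():
    print(f"{name}: {sum(scores) / len(scores):+.2f}")
print(f"Overall: {overall:+.2f}")
```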

Let’s Look Closer at a SIMPLE Prediction

Understanding VAM Classification: Scale
- Unsatisfactory: below the district average by 2.5 or more standard errors (99% confidence interval)
- Needs Improvement: below the district average by 1.5 standard errors (89% confidence interval)
- Effective: within 1.5 standard errors of the district average (89% confidence interval)
- Highly Effective: above the district average by 2.5 or more standard errors (99% confidence interval)
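
A small, hypothetical helper makes the cut points concrete. Note two assumptions the slide does not spell out: Needs Improvement is taken to cover scores between 1.5 and 2.5 standard errors below the district average, and scores between 1.5 and 2.5 standard errors above the average are treated as Effective.

```python
# Hypothetical helper applying the cut points described above.
# Assumptions (not stated on the slide): Needs Improvement covers scores
# between 2.5 and 1.5 standard errors below the district average, and
# anything between the Effective band and the Highly Effective cut is
# treated as Effective.
def classify_vam(score: float, district_avg: float, std_error: float) -> str:
    diff_in_se = (score - district_avg) / std_error
    if diff_in_se <= -2.5:
        return "Unsatisfactory"
    if diff_in_se <= -1.5:
        return "Needs Improvement"
    if diff_in_se >= 2.5:
        return "Highly Effective"
    return "Effective"

print(classify_vam(score=-0.12, district_avg=0.0, std_error=0.04))  # 3 SE below -> Unsatisfactory
print(classify_vam(score=0.02, district_avg=0.0, std_error=0.04))   # 0.5 SE above -> Effective
```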

Finding VAM Data
- Navigate to OneDrive
- Shared With Me
- AAEYourSchoolNumber
- File is 3YrAggVAM0031

Collective Bargaining Agreement: Proficiency Models
Teachers of state-tested subject areas & grade levels without a state-created VAM
Percent of Students Scoring Proficient or Higher on State Test:
- 0%-31%: Unsatisfactory
- 31.01%-41%: Needs Improvement/Developing
- 41.01%-70%: Effective
- 70.01%-100%: Highly Effective
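
The same cut points appear in most of the tables that follow (the AP/IB table later in this section uses different ones), so the look-up can be written once with the thresholds as a parameter. This is only a convenience sketch of the tables above and below, not part of the bargaining agreement, and the function name is hypothetical.

```python
# Sketch of the look-up implied by the proficiency tables in this section.
# The cut points come straight from the slides; the function is illustrative.
STANDARD_CUTS = [(31.0, "Unsatisfactory"),
                 (41.0, "Needs Improvement/Developing"),
                 (70.0, "Effective"),
                 (100.0, "Highly Effective")]

AP_IB_CUTS = [(10.0, "Unsatisfactory"),
              (20.0, "Needs Improvement/Developing"),
              (50.0, "Effective"),
              (100.0, "Highly Effective")]

def proficiency_rating(percent_proficient: float, cuts=STANDARD_CUTS) -> str:
    # Return the rating whose upper bound is the first one at or above the percent.
    for upper_bound, rating in cuts:
        if percent_proficient <= upper_bound:
            return rating
    return cuts[-1][1]

print(proficiency_rating(35.0))                   # Needs Improvement/Developing
print(proficiency_rating(35.0, cuts=AP_IB_CUTS))  # Effective (AP/IB scale)
```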

Collective Bargaining Agreement: Proficiency Models
Teachers of students in grades 3-10 teaching non-state-assessed courses (not AP/IB)
Percent of Students Scoring Proficient or Higher on closest state assessment (usually FSA ELA):
- 0%-31%: Unsatisfactory
- 31.01%-41%: Needs Improvement/Developing
- 41.01%-70%: Effective
- 70.01%-100%: Highly Effective

Collective Bargaining Agreement: Proficiency Models
Teachers of students in grades teaching non-state-assessed courses (not AP/IB)
Percent of Students Scoring Concordant or Higher / College-Ready or Higher (whichever is lower) on ANY college-ready assessment (SAT, ACT, PERT):
- 0%-31%: Unsatisfactory
- 31.01%-41%: Needs Improvement/Developing
- 41.01%-70%: Effective
- 70.01%-100%: Highly Effective

Collective Bargaining Agreement: Proficiency Models
Teachers of students in grades K-2
Percent of Students Scoring Proficient or Higher on ELA End of Year Exam:
- 0%-31%: Unsatisfactory
- 31.01%-41%: Needs Improvement/Developing
- 41.01%-70%: Effective
- 70.01%-100%: Highly Effective

Collective Bargaining Agreement: Proficiency Models
Teachers of AP or IB-assessed courses
Percent of Students Scoring Proficient or Higher on Related AP or IB Exam:
- 0%-10%: Unsatisfactory
- 10.01%-20%: Needs Improvement/Developing
- 20.01%-50%: Effective
- 50.01%-100%: Highly Effective

Collective Bargaining Agreement: Proficiency Models
Teachers of adult students assessed by TABE or CASAS
Percent of Students Who Improve by 1+ Levels on TABE or by 5+ Points on CASAS:
- 0%-31%: Unsatisfactory
- 31.01%-41%: Needs Improvement/Developing
- 41.01%-70%: Effective
- 70.01%-100%: Highly Effective