Interpreting Vendor Assessment Reports
March 2014
Copyright © 2014 American Institutes for Research and Cleveland Metropolitan School District. All rights reserved.

Outcomes 2
By the end of this presentation, participants will be able to do the following:
- Define vendor assessments within the context of teacher evaluation.
- Describe how vendor assessment growth scores are calculated.
- Identify next steps for using vendor assessment results.

What Are Vendor Assessments? 3
Vendor assessments have the following characteristics:
- They are created by an outside company or organization.
- They have been vetted by the Ohio Department of Education (ODE).
- They are used to measure student growth.

Which Vendor Assessments Does Cleveland Use? 4
The Cleveland Metropolitan School District (CMSD) uses the following vendor assessments:
- STAR assessment
- The Measures of Academic Progress (MAP) from the Northwest Evaluation Association (NWEA)
- ACT
- Quality Core U.S. History
Note: Not all teachers who administer these assessments will have vendor assessment scores. Only those teachers who do not receive value-added scores will receive vendor assessment scores.

The Multiple Measures of the Teacher Development and Evaluation System 5
- Observations
- Measures of Student Growth: value-added scores, vendor assessments, and student learning objectives (SLOs)

How Do Student Learning Objectives Fit Into the Teacher Development and Evaluation System? 6
(Diagram: Category B)

Policy Change: Teachers of Grades 2 and 3 English Language Arts 7
- Instead of receiving vendor assessment ratings based on the STAR assessment, teachers who taught Grades 2 and 3 last year will receive value-added scores based on MAP.
- The state value-added vendor (SAS) used vendor assessment data to calculate value-added growth scores for Grades 2 and 3:
  - Grade 2: Calculate growth using MAP scores from Grades 1 and 2.
  - Grade 3: Calculate growth using MAP scores from Grade 2 and Ohio Achievement Assessment scores from Grade 3.
- These calculations used prior-year data.
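To make the pairing described above concrete, here is a minimal sketch of how prior-year and current-year scores could be lined up per student before any growth model is run. The student IDs, scores, and field names are hypothetical, and the raw difference printed at the end is for illustration only; it is not the SAS value-added calculation.

```python
# Hypothetical sketch: assemble the prior/current score pairs described on this
# slide. This is not the SAS/EVAAS calculation, which uses a statistical model.

# Grade 2 teachers: prior = MAP from Grade 1, current = MAP from Grade 2.
grade2_pairs = [
    {"student": "s001", "prior_map_g1": 172, "current_map_g2": 185},
    {"student": "s002", "prior_map_g1": 160, "current_map_g2": 168},
]

# Grade 3 teachers: prior = MAP from Grade 2, current = OAA from Grade 3.
# The two tests use different score scales, which is one reason a statistical
# growth model (rather than simple subtraction) is needed for this pairing.
grade3_pairs = [
    {"student": "s101", "prior_map_g2": 188, "current_oaa_g3": 405},
    {"student": "s102", "prior_map_g2": 175, "current_oaa_g3": 392},
]

# For the Grade 2 pairing, both scores are on the MAP scale, so a raw gain is
# at least interpretable as an illustration.
for row in grade2_pairs:
    print(row["student"], row["current_map_g2"] - row["prior_map_g1"])
```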

What Is a Value-Added Method? 8
- A value-added method (VAM) is a statistical method that aims to isolate a teacher's contribution to student learning over time.
- Value-added scores are computed using standardized test scores from multiple years of testing.
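The sketch below illustrates the basic intuition behind a value-added calculation: predict each student's current score from the prior score, then see whether a teacher's students landed above or below the prediction on average. It is a deliberately simplified, hypothetical example and is not the EVAAS methodology that produces CMSD's actual value-added scores.

```python
import numpy as np

# Simplified illustration of the value-added idea (not the EVAAS model).
# All scores and teacher assignments below are made up.
prior = np.array([410, 425, 390, 450, 400, 430, 415, 395])    # last year's scores
current = np.array([420, 440, 385, 470, 415, 435, 430, 400])  # this year's scores
teacher = np.array(["A", "A", "A", "A", "B", "B", "B", "B"])

# Predict this year's score from last year's score with a simple linear fit.
slope, intercept = np.polyfit(prior, current, deg=1)
predicted = slope * prior + intercept

# A student's residual is how far above or below the prediction they scored.
residuals = current - predicted

# An (illustrative) teacher estimate is the average residual of that teacher's
# students: positive means students grew more than predicted, negative means less.
for t in ["A", "B"]:
    print(t, round(float(residuals[teacher == t].mean()), 1))
```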

Value-Added Resources 9
ODE has created a series of resources to help teachers understand value-added measures, specifically the SAS Education Value-Added Assessment System (EVAAS) methodology:
- Toolkits
- Online courses
- Report resources
Visit the ODE/Battelle for Kids website for more information: added_services/rtttop_value-added_resources/value-added_onlinecourses.html?sflang=en

Getting Ready for Teacher Value-Added Reports 10

How to Access Your Value-Added Score Based on Measures of Academic Progress 11
yq=2&wy=HowTo_access_TR_v0.7.pdf

Standardized Testing and Reporting Assessments 12
- Most other Category B teachers will receive vendor assessment growth scores based on the STAR Early Literacy assessment or the STAR assessment. Exceptions: geometry teachers, science teachers, high school U.S. history teachers.
- Growth scores using the STAR assessment will be calculated using student growth percentiles (SGPs).

How Are Vendor Assessment Ratings Calculated? 13
- SGPs compare a student’s growth with the growth of students who started with the same or a similar score.
- Based on where they fall in the distribution of student scores, students are ranked by how much they have grown, on a percentile scale of 1–99.
- An SGP of 56, for example, means that the student grew as much as or more than 56 percent of his or her peers.
- To calculate the teacher’s student growth rating, the vendor looks at the SGPs of all the teacher’s students and determines the group median SGP.
- The group median SGP is then converted to a score of 1–5.
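As a minimal sketch of the median step described above: take the per-student SGPs from the vendor report and find the group median. The students and SGP values here are hypothetical; producing the SGPs themselves is the vendor's calculation and is not shown.

```python
from statistics import median

# Hypothetical per-student SGPs as they might appear on a vendor report.
student_sgps = {
    "Student 1": 22,
    "Student 2": 48,
    "Student 3": 51,
    "Student 4": 64,
    "Student 5": 73,
}

# The teacher's growth measure starts from the group median SGP.
group_median_sgp = median(student_sgps.values())
print(group_median_sgp)  # 51 for these hypothetical values
```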

What Would This Teacher’s Median Student Growth Percentile Be? 14
Student growth percentiles by student:
- Mari: 25
- Max: 35
- Marissa: 45
- Mary: 56
- Matthew: 60
- Mark: 63
- Melissa: 67

What Would This Teacher’s Median Student Growth Percentile Be? 15
Student growth percentiles by student:
- Mari: 25
- Max: 35
- Marissa: 45
- Mary: 56
- Matthew: 60
- Mark: 63
- Melissa: 67
With seven students, the median is the fourth value when the SGPs are placed in order, so the group median SGP is 56.

Vendor Assessment Ratings Based on Standardized Testing and Reporting Assessments 16
Group median student growth percentile, with descriptive and numerical ratings:
- 81–99: Most effective (5)
- 61–80: Above average effectiveness (4)
- 41–60: Average effectiveness (3)
- 21–40: Approaching average effectiveness (2)
- 1–20: Least effective (1)
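A small sketch of the final conversion step, using the bands from the table above. Applied to the example class from the previous slides, whose group median SGP is 56, it returns a numerical rating of 3 (Average effectiveness). The function name is illustrative.

```python
def rating_from_median_sgp(median_sgp):
    """Convert a group median SGP (1-99) into the 1-5 vendor assessment rating,
    using the bands shown in the table above."""
    bands = [
        (81, 99, 5, "Most effective"),
        (61, 80, 4, "Above average effectiveness"),
        (41, 60, 3, "Average effectiveness"),
        (21, 40, 2, "Approaching average effectiveness"),
        (1, 20, 1, "Least effective"),
    ]
    for low, high, score, label in bands:
        if low <= median_sgp <= high:
            return score, label
    raise ValueError("Median SGP must be between 1 and 99.")

print(rating_from_median_sgp(56))  # (3, 'Average effectiveness')
```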

Sample Report 17

Sample Report 18

Reflection 19

Self-Reflection for Teachers Receiving Growth Reports Using Standardized Testing and Reporting Assessments 20
- Which students have high SGPs? Which students have low SGPs? Do you notice any patterns or trends?
- Are the data from the STAR assessment consistent with the growth you have seen in your students from other assessments and student work throughout the year?
- What feedback do you have for the district about how these assessments could be more valuable and meaningful?

Self-Reflection for Teachers 21
- How well does my vendor assessment rating align with other information I have about my students’ performance last year?
- How does this rating compare with observational scores or SLO ratings?
- Is my vendor assessment rating what I expected it to be? How might I adjust my instruction or support my colleagues to improve student growth in the future?

Self-Reflection for Administrators 22
- Overall, what picture do data from vendor assessments paint?
- How do data from vendor assessments compare with data from state assessments and other sources of data?
- In what subjects and grades did students show the most growth?
- In what subjects and grades do I need to better support my teachers?

Other Webinars 23
Learn more about each measure included in teacher evaluations, as well as the summative rating process:
- Interpreting Value-Added Measure Reports
- Overview of the SLO Rating Process
- Determining Summative Ratings

Thank you. Presenter Name XXX-XXX-XXXX 24