
1 Copyright © 2014 American Institutes for Research and Cleveland Metropolitan School District. All rights reserved. March 2014 Interpreting Vendor Assessment Reports

2 Outcomes
By the end of this presentation, participants will be able to do the following:
- Define vendor assessments within the context of teacher evaluation.
- Describe how vendor assessment growth scores are calculated.
- Identify next steps for using vendor assessment results.

3 What Are Vendor Assessments?
Vendor assessments have the following characteristics:
- They are created by an outside company or organization.
- They have been vetted by the Ohio Department of Education (ODE).
- They are used to measure student growth.

4 Which Vendor Assessments Does Cleveland Use?
The Cleveland Metropolitan School District (CMSD) uses the following vendor assessments:
- STAR assessment
- The Measures of Academic Progress (MAP) from the Northwest Evaluation Association (NWEA)
- ACT
- QualityCore U.S. History
Note: Not all teachers who administer these assessments will have vendor assessment scores. Only those teachers who do not receive value-added scores will receive vendor assessment scores.

5 The Multiple Measures of the Teacher Development and Evaluation System
- Observations
- Measures of Student Growth:
  - Value-added scores
  - Vendor assessments
  - Student learning objectives (SLOs)

6 How Do Student Learning Objectives Fit Into the Teacher Development and Evaluation System?
Category B

7 Policy Change: Teachers of Grades 2 and 3 English Language Arts
- Instead of receiving vendor assessment ratings based on the STAR assessment, teachers who taught Grade 2 or 3 English language arts last year will receive value-added scores based on MAP.
- The state value-added vendor (SAS) used vendor assessment data to calculate value-added growth scores for Grades 2 and 3:
  Grade 2: Growth is calculated using MAP scores from Grades 1 and 2.
  Grade 3: Growth is calculated using MAP scores from Grade 2 and Ohio Achievement Assessment scores from Grade 3.
- These calculations used prior-year data.

8 What Is a Value-Added Method?
- A value-added method (VAM) is a statistical method that aims to isolate a teacher’s contribution to student learning over time.
- Value-added scores are computed using students’ standardized test scores from multiple years of test taking.

9 Value-Added Resources
ODE has created a series of resources to help teachers understand value-added reporting, specifically the SAS Education Value-Added Assessment System (EVAAS) methodology:
- Toolkits
- Online courses
- Report resources
Visit the ODE/Battelle for Kids website for more information: http://portal.battelleforkids.org/Ohio/Race_to_the_Top/rtttop_value-added_services/rtttop_value-added_resources/value-added_onlinecourses.html?sflang=en

10 Getting Ready for Teacher Value-Added Reports

11 How to Access Your Value-Added Score Based on Measures of Academic Progress
https://ohiova.sas.com/unrestricted.download?ab=dn&as=c&yq=2&wy=HowTo_access_TR_v0.7.pdf

12 Standardized Testing and Reporting Assessments
- Most other Category B teachers will receive vendor assessment growth scores based on the STAR Early Literacy assessment or the STAR assessment. Exceptions: geometry teachers, science teachers, and high school U.S. history teachers.
- Growth scores using the STAR assessment will be calculated using student growth percentiles (SGPs).

13 How Are Vendor Assessment Ratings Calculated?
- SGPs compare a student’s growth with the growth of students who started with the same or a similar score.
- Based on where they fall in the distribution of student scores, students are ranked by how much they have grown, in percentiles on a scale of 1–99.
- An SGP of 56, for example, means that the student grew as much as or more than 56 percent of his or her peers.
- To calculate a teacher’s student growth rating, the vendor looks at the SGPs of all the teacher’s students and determines the group median SGP.
- The group median SGP is then converted to a score of 1–5.

14 What Would This Teacher’s Median Student Growth Percentile Be?

Student Name    Student Growth Percentile
Mari            25
Max             35
Marissa         45
Mary            56
Matthew         60
Mark            63
Melissa         67

15 What Would This Teacher’s Median Student Growth Percentile Be?

Student Name    Student Growth Percentile
Mari            25
Max             35
Marissa         45
Mary            56
Matthew         60
Mark            63
Melissa         67

The median is 56 (Mary’s score, the middle value of the seven ordered percentiles).

16 Vendor Assessment Ratings Based on Standardized Testing and Reporting Assessments

Group Median Student Growth Percentile    Descriptive Rating                   Numerical Rating
81–99                                     Most effective                       5
61–80                                     Above average effectiveness          4
41–60                                     Average effectiveness                3
21–40                                     Approaching average effectiveness    2
1–20                                      Least effective                      1
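The calculation described on the preceding slides, taking the median of a teacher's students' SGPs and converting it to a 1–5 rating using the published bands, can be sketched in a few lines of Python. This is an illustration of the arithmetic only, not the vendor's actual implementation; the function and variable names are our own.

```python
# Sketch of the vendor assessment growth-rating calculation described in
# the slides: median of student growth percentiles (SGPs), mapped to the
# rating bands from the table above. Illustrative only.
from statistics import median

# (lower bound, upper bound, numerical rating, descriptive rating)
RATING_BANDS = [
    (81, 99, 5, "Most effective"),
    (61, 80, 4, "Above average effectiveness"),
    (41, 60, 3, "Average effectiveness"),
    (21, 40, 2, "Approaching average effectiveness"),
    (1, 20, 1, "Least effective"),
]

def growth_rating(sgps):
    """Return (group median SGP, numerical rating, descriptive rating)."""
    m = median(sgps)
    for low, high, number, label in RATING_BANDS:
        if low <= m <= high:
            return m, number, label
    # With an even number of students the median can fall between two
    # bands (e.g. 20.5); the slides do not say how the vendor handles
    # that boundary, so we simply flag it here.
    raise ValueError("median SGP falls outside the published bands")

# The seven students from the example slide:
sgps = [25, 35, 45, 56, 60, 63, 67]
print(growth_rating(sgps))  # (56, 3, 'Average effectiveness')
```

Running this on the example roster confirms the worked answer: the group median SGP is 56, which falls in the 41–60 band and yields a numerical rating of 3 (Average effectiveness).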

17 Sample Report
http://doc.renlearn.com/kmnet/R004093913GG2E79.pdf#page=3

18 Sample Report

19 Reflection

20 Self-Reflection for Teachers Receiving Growth Reports Using Standardized Testing and Reporting Assessments
- Which students have high SGPs? Which students have low SGPs? Do you notice any patterns or trends?
- Are the data from the STAR assessment consistent with the growth you have seen in your students on other assessments and in student work throughout the year?
- What feedback do you have for the district about how these assessments could be made more valuable and meaningful?

21 Self-Reflection for Teachers
- How well does my vendor assessment rating align with other information I have about my students’ performance last year?
- How does this rating compare with my observation scores or SLO ratings?
- Is my vendor assessment rating what I expected it to be? How might I adjust my instruction or support my colleagues to improve student growth in the future?

22 Self-Reflection for Administrators
- Overall, what picture do the data from vendor assessments paint?
- How do data from vendor assessments compare with data from state assessments and other sources?
- In what subjects and grades did students show the most growth?
- In what subjects and grades do I need to better support my teachers?

23 Other Webinars
Learn more about each measure included in teacher evaluations, as well as the summative rating process:
- Interpreting Value-Added Measure Reports
- Overview of the SLO Rating Process
- Determining Summative Ratings

24 Thank you.
Presenter Name
XXX-XXX-XXXX
xxxxxxxxxxx@xxx
