
1
**Teacher Value-Added Reports State Board of Education January 14, 2014**

Connect for Success Conference 2013. Presented by Jamie Meade, Managing Director, Strategic Measures, Battelle for Kids.

2
Session Objectives Provide value-added information relevant to the practitioner’s perspective. What should teachers and administrators know and understand about value-added reporting? How may educators use value-added information to improve professional practice and impact student academic achievement and progress?

3
Value-Added in Ohio For over 10 years, Battelle for Kids has provided support for professional learning and fostered collegial dialogue for understanding and using value-added measures. Battelle for Kids continues to advocate for the use of value-added measures, in combination with other educational measures, to improve practice and accelerate student academic progress.

4
**Ohio’s Value-Added History**

- 2002: Battelle for Kids' SOAR Collaborative, School and District Value-Added Reports
- 2006: Battelle for Kids' TCAP Project, Teacher Value-Added Reports
- 2007: ODE Value-Added on the Local Report Card
- 2009: Battelle for Kids' Ohio VA High Schools
- 2011: RttT: 30% of Teachers, VA Reports
- 2012: RttT: 60% of Teachers, VA Reports
- 2013: Grades 4–8 Reading & Math Teachers, VA Reports; 80 RttT Districts, K–3 Reading & Math Teachers, VA Reports

5
**Value-Added in Ohio Educational Value Added Assessment System (EVAAS)**

SAS Analytics: customers in 135 countries and more than 65,000 business, government, and university sites. SAS customers or their affiliates represent 90 of the top 100 companies on the 2012 FORTUNE Global 500® list.

6
**Value-Added Information in Practice: Building Awareness**

Understanding the difference: achievement measures vs. progress/growth measures.

7
**Achievement & Growth: Understanding the Difference**

The Achievement Bar (blue bar) represents our expectations for what students should know, understand, and be able to do at a specific point in time. Achievement is measured at a single point in time, usually with a single assessment; it is often described as "passing" a test.

As students enter our classrooms each year, they arrive with varying levels of prior knowledge, skills, and understanding. (Presenter note: click through the animation so the blue dots represent students at their varying entry points.) Because students enter at different levels, a school's or teacher's effect on learning cannot be determined from an achievement measure taken at a single point in time.

Measures of student growth take into consideration where a student begins and his or her historical assessment data (as available); thus, multiple data points are used. Growth measures a student from point A to point B.

Important note: value-added and other growth measures are NOT measuring whether or not a student passes a test. This is a common misunderstanding about value-added and growth measures in general. Remember, it's not about "passing" a test.

8
**Measuring Growth is Important for ALL Students**

(Chart: grades 3rd through 8th on the horizontal axis, with a "Proficient" line; Jacob's scores sit above the line, Adam's below it.)

Value-added provides a picture of student growth regardless of students' achievement levels. It can help us understand whether high-achieving students are making enough progress to sustain or even improve their achievement levels, and whether low-achieving students are making enough progress to close the proficiency gap.

Presenter note: discuss this slide from the perspective of a school with Jacob-like or Adam-like students, since value-added is a group effect rather than an individual student result. In this chart, Jacob is currently above the proficiency bar but is losing ground relative to proficiency, while Adam is not yet proficient but is closing the gap on the proficiency bar.

9
**ES/MS Data Team Session 1 2010-2011**

"Why can't we simply compare OAA scaled scores from one year to the next to measure growth?"

(Table: OAA Math scaled score ranges, May 2013, by performance level — Limited, Basic, Proficient, Accelerated, Advanced — for grades 3rd through 8th.)

Notice that the maximum and minimum scaled scores are different at each grade level. Additionally, the maximum and minimum scaled score on each grade-level test vary from year to year. This means measuring progress through the scaled score is not possible, because the scales across grade-level tests are not vertically linked. Although it is a common practice among Ohio educators (and some parents), a student's increase in scaled score from one school year to the next does not indicate student progress; likewise, a decrease in the scaled score does not indicate a lack of progress. For example, a score of 596 in grade 6 followed by a score of 578 in grade 7 does not mean the student did not progress. Simply looking at scaled scores to detect student growth is quite deceiving and simply inaccurate.

Presenter note: There are ___ points up for grabs. What is the maximum PI score?

10
Questions?

11
**Value-Added Information in Practice: SAS EVAAS MRM Model**

Multivariate Response Model (MRM): grades 4–8 Reading and Math VA Reports; uses the Ohio Achievement Assessments.

Reminder: this session is not about OTES/eTPES; it is about exploring the "anatomy" of the Teacher Value-Added Report so that you will be prepared to interpret the report.

12
**SAS EVAAS Value-Added MRM Model***

- Uses grades 3–8 Reading and Math OAAs.
- Compares the average growth of students in the most recent year to the average growth of students in 2010 (the state's baseline year).*
- Growth expectation is defined as maintaining placement in the distribution of NCE scores from one year to the next.*

*Conceptual definition.

13
**Raw Score Scaled Score NCE**

Sample ranges: raw score up to 52; scaled score 247 to 551; Normal Curve Equivalent (NCE) 1 to 99.

Presenter note: animations can be used to keep participants focused on the message as you explain the sequential process for deriving NCEs from a student's raw score. First, a raw score represents the points received from questions answered correctly (MC = 1 pt., SA = 2 pts., ER = 4 pts.). The raw score is then converted to a three-digit scaled score; in this sample, scaled scores range from a minimum of 247 to a maximum of 551. These scaled scores are then converted to an NCE. On value-added reports, NCEs are illustrated on a range from 1 to 99; it may be important to note that NCEs can range higher than 99, and often do. For simplicity, the EVAAS reports cap the displayed NCE at 99. NCE = Normal Curve Equivalent; let's discuss that more on the next slide.
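The percentile-to-NCE relationship behind this conversion can be sketched in Python. This is an illustrative sketch using the standard NCE scaling factor of 21.06, not SAS EVAAS's actual implementation, which anchors the conversion to the state's 2010 baseline distribution:

```python
from statistics import NormalDist

def percentile_to_nce(pct: float) -> float:
    """Convert a percentile rank (1-99) to a Normal Curve Equivalent.

    NCEs rescale percentiles through the inverse normal curve so that
    1, 50, and 99 line up on both scales and the intervals are equal.
    """
    z = NormalDist().inv_cdf(pct / 100.0)  # percentile -> z-score
    nce = 50 + 21.06 * z                   # 21.06 makes NCE 1/50/99 match percentiles 1/50/99
    return max(1.0, min(99.0, nce))        # reports display NCEs capped at the 1-99 range
```

A percentile of 50 maps to an NCE of 50, but because the NCE scale is equal-interval, a percentile of 75 maps to roughly 64, not 75.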

14
**Scaled Scores Converted to NCEs in State’s 2010 Baseline Year**

(Chart: example 2010 scaled scores, rank-ordered from 247 to 551, converted to Normal Curve Equivalents from 1 to 99. The conversion values are fixed/frozen.)

15
**Value-Added Terminology**

Normal Curve Equivalent (NCE): the NCE is similar to a percentile rank in that scores are derived from scaled scores and ranked based upon performance. A significant difference between a percentile rank and an NCE is that the NCE scale is an equal-interval scale.

16
**Normal Curve Equivalent (NCE)**

A normal curve equivalent (NCE) indicates a student's rank compared to other students on the same test (similar to a percentile):

- NCEs run from 1 to 99, with 50 at the center of the base-year distribution.
- BUT: normal curve equivalents convert scaled scores to an equal-interval scale.
- Since NCEs are represented on an equal-interval scale, scores can be averaged and compared longitudinally.
- The NCE scale enables longitudinal data connections and the definition of a growth standard that does not change from year to year.
- NCEs represent where a student's score would place that student relative to student performance in the state's base year (2010 for Ohio).

17
**Value-Added Terminology**

Baseline score: a group of students' prior-year mean NCE (example: spring 2012 OAA mean NCE). Observed score: the same group of students' new/most recent mean NCE (example: spring 2013 OAA mean NCE).

Presenter note: before we use the terms on the next slide, a brief explanation of these terms may be helpful.

18
**SAS EVAAS MRM Model Basic, Conceptual Example**

Expected growth: maintain placement in the distribution of scores. For example, a student at the 20th NCE must "at least" stay at the 20th NCE.

| Student | Baseline (OAA 2012) | Observed (OAA 2013) | Change |
|---|---|---|---|
| Student 1 | 20 NCE | 20 NCE | 0 |
| Student 2 | 55 NCE | 67 NCE | +12 |
| Student 3 | 92 NCE | 89 NCE | -3 |
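The "maintain placement" expectation can be sketched as a simple difference of NCEs. This is a conceptual illustration only, using the slide's three hypothetical students:

```python
def growth_vs_expectation(baseline_nce: float, observed_nce: float) -> float:
    """Expected growth means holding position in the NCE distribution.

    Returns the NCE change: 0 means the student maintained placement,
    positive means the student gained ground, negative means lost ground.
    """
    return observed_nce - baseline_nce

# The three hypothetical students from the slide: (baseline, observed)
students = {"Student 1": (20, 20), "Student 2": (55, 67), "Student 3": (92, 89)}
diffs = {name: growth_vs_expectation(b, o) for name, (b, o) in students.items()}
# Student 1 maintained (0), Student 2 gained (+12), Student 3 lost ground (-3)
```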

19
**Basic, Conceptual Example**

Scaled scores are converted to NCEs:

| Grade 6 Baseline (scaled score = NCE) | Grade 7 Observed (NCE) |
|---|---|
| 394 = 46 | 59 |
| 402 = 50 | 54 |
| 384 = 42 | 49 |
| 394 = 46 | 44 |
| 410 = 52 | 57 |

Mean baseline = 47.2. Mean observed = 52.6. Growth = mean observed - mean baseline = 52.6 - 47.2 = 5.4 (mean NCE gain). A basic measure of the growth for this group is 5.4 NCEs.

This is an example for teaching purposes only. EVAAS calculations are more statistically sophisticated, ensuring that all students are included in the analysis and that confidence intervals reflect the entire history of student testing. The EVAAS methodology also allows future data to refine past estimates for greater accuracy. NCEs are Normal Curve Equivalents; the NCE scale enables longitudinal data connections and the definition of a growth standard that does not change from year to year. This model links student data from one year to the next.
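The slide's mean-gain arithmetic can be reproduced directly. As the slide stresses, this is the teaching example only, not the full EVAAS computation:

```python
def mean_nce_gain(baseline_nces, observed_nces):
    """Basic mean-gain growth measure: mean observed NCE minus mean baseline NCE."""
    mean_base = sum(baseline_nces) / len(baseline_nces)
    mean_obs = sum(observed_nces) / len(observed_nces)
    return round(mean_obs - mean_base, 1)

# The five students from the slide's example:
baseline = [46, 50, 42, 46, 52]  # grade 6 NCEs (from scaled scores 394, 402, 384, 394, 410)
observed = [59, 54, 49, 44, 57]  # grade 7 NCEs
gain = mean_nce_gain(baseline, observed)  # 52.6 - 47.2 = 5.4
```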

20
Questions?

21
**Levels of Value-Added Effects**

- Students are making substantially more progress than the state growth standard.
- Students are making more progress than the state growth standard.
- Students are making about the same amount of progress as the state growth standard.
- Students are making less progress than the state growth standard.
- Students are making substantially less progress than the state growth standard.

22
This slide illustrates the basic calculations of the mean-gain model. For example, note that the estimated district mean NCE for grade 3 in 2010 was 60.0 (baseline mean score). In 2011, the same cohort of students had an estimated mean NCE of 61.2 (observed mean score). The difference between the two (give or take the doubled standard error of 0.8) is the value-added mean gain in the top half of the report.

23
**Standard Error**

Standard error: a measure of the uncertainty. All measures of student learning contain error. In the EVAAS teacher value-added report, the size of the standard error is influenced by:

- N size (the size of the student group).
- Missing scores.
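The effect of N size on the standard error can be illustrated with the generic standard-error-of-the-mean formula. This is a sketch only, not the EVAAS model's actual error computation, which pools each student's full testing history:

```python
import math

def standard_error_of_mean(scores):
    """Standard error of a mean: sample standard deviation divided by sqrt(N).

    Shows why the standard error shrinks as the student group (N) grows.
    """
    n = len(scores)
    mean = sum(scores) / n
    variance = sum((s - mean) ** 2 for s in scores) / (n - 1)  # sample variance
    return math.sqrt(variance / n)

se_small = standard_error_of_mean([40, 50, 60])      # N = 3
se_large = standard_error_of_mean([40, 50, 60] * 4)  # same spread, N = 12
```

With the same spread of scores, quadrupling the group size roughly halves the standard error, which is why teacher estimates based on small classes carry more uncertainty.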

24
**Standard Error and Multi-Year Estimates**

(Chart: estimates for Year 1, Year 2, Year 3, and the multi-year average, each plotted with its standard-error range on a scale from -2 to 4. The Year 1 estimate is 1.5, with a range from -1.0 to 4.0. The estimate is the most likely value, and the true value lies somewhere within the range of the standard error.)

25
**Teacher Value-Added Report**

Note: Battelle for Kids is utilizing visual representations of copyrighted EVAAS® Web reporting software from SAS in this presentation for instructional purposes.

Reminder: this session is not about OTES/eTPES; it is about exploring the "anatomy" of the Teacher Value-Added Report so that you will be prepared to interpret the report.

26
**Value-Added Terminology**

Growth Index: since the size of the standard error (degree of certainty) will vary across teachers, the estimated gain must be standardized to reflect both the estimate and the degree of certainty. The growth index divides a teacher's estimated gain by its associated standard error. The growth index appears on the Teacher Report but not on the School or District Report; for this reason, it may be a new term for LEAs that have not yet received a Teacher VA report. Take the time to discuss this term before using it in the upcoming slides.

27
**The Teacher Value-Added Report**

Presenter note: snapshot of the whole report; a breakdown follows.

28
**The Teacher Progress Table**

29
**Levels of Teacher Value-Added Effects**

- Most Effective (teacher's index: 2 or greater): students are making substantially more progress than the state growth standard.
- Above Average (teacher's index: equal to or greater than 1, but less than 2): students are making more progress than the state growth standard.
- Average (teacher's index: equal to or greater than -1, but less than 1): students are making about the same amount of progress as the state growth standard.
- Approaching Average (teacher's index: equal to or greater than -2, but less than -1): students are making less progress than the state growth standard.
- Least Effective (teacher's index: less than -2): students are making substantially less progress than the state growth standard.
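The five cut points above can be expressed as a small classifier. The function below is a sketch for discussion; in practice the gain and standard-error values would come from the EVAAS report itself:

```python
def teacher_effect_level(gain: float, std_err: float) -> str:
    """Classify a teacher value-added effect from the growth index.

    Growth index = estimated gain / standard error; the cut points
    follow the five levels listed on the slide.
    """
    index = gain / std_err
    if index >= 2:
        return "Most Effective"
    if index >= 1:
        return "Above Average"
    if index >= -1:
        return "Average"
    if index >= -2:
        return "Approaching Average"
    return "Least Effective"

# e.g. a gain of 5.4 NCEs with a standard error of 2.0 gives an index of 2.7,
# which falls in the "Most Effective" band
```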

30
**Foundational Statistics**

Ohio's Academic Content Standards, Mathematics: Statistical Methods

- Grade 10: Interpret the relationship between two variables using multiple graphical displays and statistical measures (e.g., box-and-whisker plots and measures of center and spread).
- Grade 8: Describe how the relative size of a sample compared to the target population affects the validity of predictions. Explain the mean's sensitivity to extremes...
- Grade 7: ...describe how the inclusion and exclusion of outliers affect those measures.
- Grade 6: Understand the different information provided by measures of center (mean, mode, median) and measures of spread (range).
- Grade 1: Describe the likelihood of simple events as possible/impossible and more likely/less likely.

31
**Using Value-Added to Inform Practice**

Key considerations:

- Systemic: programs, delivery models, structures, services, etc.
- Professional learning
- Curriculum alignment: what I teach
- Assessment: how I measure/monitor learning along the way
- Instruction: how I teach

32
**Combining Measures to Inform Practice**

All measures of student learning contain error. No single measure can capture the complexity of learning and teaching. There's an important distinction between a flawed measure and a flawed assumption based upon a single measure.

33
Questions? Thank You! BattelleforKids.org
