EVAAS Overview.

SAS EVAAS Analyses: What is EVAAS? Looking back: results from the Writing, ACT, End of Course, and End of Grade tests. Looking ahead: planning for students' needs through student projections to future tests. Looking back: evaluating schooling effectiveness through Value Added and Diagnostic reports.

How can EVAAS help me? Improving the education program draws on three sources: EVAAS looking back (past program effectiveness), local knowledge and expertise, and EVAAS looking ahead (incoming student needs). EVAAS helps by allowing educators to analyze past program performance for trends and to make informed projections for current and incoming students.

Education Value Added Assessment System. EVAAS answers the question of how effective a schooling experience is. It produces reports that predict student success, show the effects of schooling at particular schools, and reveal patterns in subgroup performance. EVAAS receives data only AFTER DPI collects it through the secure shell, runs its processes, and checks for validity. Once DPI has completed its work with the data, it presents the results to the SBE; at that point, the data is sent to EVAAS.

Changes in Reporting for 2012-13. The 2011-12 categories map to new 2012-13 descriptors: Above becomes Exceeds Expected Growth, Not Detectably Different becomes Meets Expected Growth, and Below becomes Does Not Meet Expected Growth. The descriptors in EVAAS now match the Standard 6 ratings.
2011-12 color coding and descriptors:
Above (Green): students in the district made significantly more progress in this subject than students in the average district in NC; progress was at least two standard errors above average.
Not Detectably Different (Yellow): not detectably different from students in the average district; less than two standard errors above average and no more than two standard errors below it.
Below (Light Red): students in the district made significantly less progress in this subject than students in the average district in NC; progress was more than two standard errors below average.
2012-13 color coding and descriptors:
Exceeds Expected Growth (Blue): estimated mean NCE gain is above the growth standard by at least 2 standard errors.
Meets Expected Growth (Green): estimated mean NCE gain is no more than 2 standard errors below the growth standard and less than 2 standard errors above it.
Does Not Meet Expected Growth (Red): estimated mean NCE gain is below the growth standard by more than 2 standard errors.
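
As a minimal sketch of how these cutoffs work, the hypothetical function below classifies an estimated mean NCE gain into the 2012-13 descriptors. The two-standard-error rule comes from the slide above; the function and variable names are illustrative assumptions, not the actual EVAAS implementation.

```python
def growth_category(mean_nce_gain: float, standard_error: float,
                    growth_standard: float = 0.0) -> str:
    """Classify an estimated mean NCE gain using the 2012-13 descriptors.

    The cutoffs follow the two-standard-error rule described above;
    the growth standard is 0.0 when gain is reported in NCEs.
    """
    diff = mean_nce_gain - growth_standard
    if diff >= 2 * standard_error:
        return "Exceeds Expected Growth (Blue)"
    elif diff >= -2 * standard_error:  # at most 2 SEs below, under 2 SEs above
        return "Meets Expected Growth (Green)"
    else:
        return "Does Not Meet Expected Growth (Red)"

# Example: a gain of 1.5 NCEs with a standard error of 0.6
print(growth_category(1.5, 0.6))  # Exceeds Expected Growth (Blue)
```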

District Value Added Report. Use this report to evaluate the overall effectiveness of a district on student progress. It compares each district to the average district in the state for each subject tested in the given year and indicates how a district influences student progress in the tested subjects. We will look at three kinds of reports at the district level (value-added, diagnostic, and performance diagnostic) and review how to read them, because you will read your school data the same way. The reports have elements in common, and once you can interpret the district reports, you will be able to read your school reports easily.

Value-Added Reporting. Use this report to evaluate the overall effectiveness of a school on student progress. The School Value Added Report compares each school to the average school in the state. Comparisons are made for each subject tested in the given year and indicate how a school influences student progress in those subjects. (Facilitator's preference for the next few slides until break: use only the PowerPoint, or model on the live site; more questions come up when using the live site.)

Value-Added Reporting Scores from the EOG tests are converted to State NCEs (Normal Curve Equivalent scores) for the purpose of these analyses. NCE scores have the advantage of being on an equal-interval scale, which allows for a comparison of students' academic attainment level across grades. NCE scores remain the same from year to year for students who make exactly one year of progress after one year of instruction, even though their raw scores would be different. Their NCE gain would be zero.
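
To make the conversion concrete, here is a minimal sketch of the standard percentile-to-NCE transformation. NCEs are by definition normalized scores with mean 50 and standard deviation 21.06; the exact conversion tables used for the EOG tests may differ in detail, and the function name is an illustrative assumption.

```python
from statistics import NormalDist

def percentile_to_nce(percentile: float) -> float:
    """Convert a percentile rank (between 0 and 100, exclusive) to an NCE.

    NCEs place scores on an equal-interval scale with mean 50 and
    standard deviation 21.06, so NCEs 1, 50, and 99 line up with
    percentile ranks 1, 50, and 99.
    """
    z = NormalDist().inv_cdf(percentile / 100.0)  # normal deviate for the rank
    return 50.0 + 21.06 * z

# A student at the 75th percentile in consecutive years keeps the same NCE,
# so the NCE gain is zero: exactly one year of progress in one year.
print(round(percentile_to_nce(75.0), 1))  # about 64.2
```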

Value-Added Reporting. The NCE Base is by definition set at 50.0, and it represents the average attainment level of students in the grade and subject, statewide. Compare the estimated grade/year mean for a school to the NCE Base: if the school mean is greater, the average student in the school is performing at a higher achievement level than the average student in the state. Student achievement levels appear at the bottom of the report in the Estimated School Mean NCE Scores section.

District Diagnostic Reports Use to identify patterns or trends of progress among students expected to score at different achievement levels Caution: subgroup means come from “a liberal statistical process” that is “less conservative than estimates of a district’s influence on student progress in the District Value Added Report”

Diagnostic Report. Use this report to identify patterns or trends of progress among students expected to score at different achievement levels. This report is intended for diagnostic purposes only and should not be used for accountability. Explain that on this report the students are grouped into quintiles and assigned to groups on a statewide basis. The assignment pattern shows schools how their students are distributed compared to other students in the same grade across the state, and how their performance compares to that of similar students throughout the state.

District Performance Diagnostic Reports. Use this report to identify patterns or trends of progress among students predicted to score at different performance levels as determined by their scores on NC tests. Students are assigned to Projected Performance Levels based on their predicted scores. The report shows the number (Nr) and percentage of students in the district that fall into each Projected Performance Level. Click on the underlined number in the Mean or Nr of Students row for a subgroup to see the names of the students assigned to the subgroup. Click on the % of Students for the current year or for Previous Cohort(s) to see the data in pie chart format.
Mean Differences: the mean of the difference between the students' observed test performance and their predicted performance appears for each Projected Performance Level, along with the standard error associated with the mean. The standard error allows you to establish a confidence band around the mean. A large negative mean indicates that students within a group made less progress than expected; a large positive mean indicates that students within a group made more progress than expected; a mean of approximately 0.0 indicates that a group has progressed at an average rate in the given subject. When the means among groups vary markedly, districts may want to explore ways to improve the instruction for students making less progress.
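
As a minimal sketch of the Mean Differences idea, the hypothetical snippet below computes the mean observed-minus-predicted difference for one group, its standard error, and a two-standard-error confidence band. The data, function name, and the plug-in standard-error formula are illustrative assumptions, not the actual EVAAS computation.

```python
from statistics import mean, stdev
from math import sqrt

def mean_difference(observed: list[float], predicted: list[float]):
    """Mean of (observed - predicted) scores for a group, with its
    standard error and a two-standard-error confidence band."""
    diffs = [o - p for o, p in zip(observed, predicted)]
    m = mean(diffs)
    se = stdev(diffs) / sqrt(len(diffs))  # fewer students -> larger SE
    return m, se, (m - 2 * se, m + 2 * se)

# Illustrative scores for one Projected Performance Level subgroup
observed = [52.0, 48.5, 55.1, 60.3, 47.2]
predicted = [50.0, 49.0, 53.0, 57.5, 49.0]
m, se, band = mean_difference(observed, predicted)
print(f"mean diff {m:.2f}, SE {se:.2f}, band {band[0]:.2f} to {band[1]:.2f}")
```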

District Performance Diagnostic Reports. The Reference Line in the table indicates the gain necessary for students in each Prior-Achievement Subgroup to make expected progress, and it reflects the growth standard. When Gain is reported in NCEs, as it is here, the growth standard is 0.0. The Gain is a measure of the relative progress of the school's students in each Prior-Achievement Subgroup compared to the Growth Standard. Standard errors appear beneath the Gain for each Prior-Achievement Subgroup. The standard error allows the user to establish a confidence band around the estimate; the smaller the number of students, the larger the standard error. A student becomes a member of a Prior-Achievement Subgroup based on the average of his or her current and previous year NCE scores. A single student score contains measurement error; using the average of two years allows a more appropriate assignment. The Nr of Students row shows the number of students in a subgroup. Some subgroups may contain more students than others because students are assigned to groups on a statewide basis. The assignment pattern shows schools how their students are distributed compared to other students in the same grade across the state.
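
To make the subgroup assignment concrete, here is a minimal sketch that averages each student's current and previous year NCE scores and then ranks the cohort into equal-sized groups. The quintile split, cohort size, and names are illustrative assumptions; EVAAS sets its cut points statewide rather than over a single class.

```python
def assign_subgroups(current_nce: dict[str, float],
                     previous_nce: dict[str, float],
                     n_groups: int = 5) -> dict[str, int]:
    """Assign each student to a Prior-Achievement Subgroup (1 = lowest).

    Averaging two years of NCE scores dampens the measurement error
    present in any single score, as described above.
    """
    averages = {s: (current_nce[s] + previous_nce[s]) / 2 for s in current_nce}
    ranked = sorted(averages, key=averages.get)  # lowest average first
    return {student: (rank * n_groups) // len(ranked) + 1
            for rank, student in enumerate(ranked)}

current = {"A": 45.0, "B": 62.0, "C": 38.0, "D": 71.0, "E": 55.0}
previous = {"A": 50.0, "B": 58.0, "C": 41.0, "D": 69.0, "E": 52.0}
print(assign_subgroups(current, previous))
# {'C': 1, 'A': 2, 'E': 3, 'B': 4, 'D': 5}
```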

Interpreting the Pie Chart. The Pie Chart shows the percent of students in each subgroup and compares their progress to the Growth Standard.
Green: the progress of students in this group was more than one standard error above that of students in the average district in the state.
Yellow: students in this group progressed at a rate similar to that of students in the average district in the state.
Light Red: students in the group made more than one standard error less progress in this subject than students in the average district in the state.