OCTEO INTRODUCTION TO VALUE-ADDED ANALYSIS October 25, 2012.

Presentation transcript:

OCTEO INTRODUCTION TO VALUE-ADDED ANALYSIS October 25, 2012

© 2012, Battelle for Kids. All Rights Reserved. Presenter: Dr. Mike Thomas, Senior Director of Innovation

© 2012, Battelle for Kids. Understanding Value-Added Information
Learning Targets:
- The Ohio context
- Understand where and how to access value-added tools and resources
- Higher Education training opportunity
- Understand value-added and diagnostic reports

© 2012, Battelle for Kids. All Rights Reserved. Common Core Standards, PARCC Assessments, Race to the Top, Teacher Effectiveness, Ohio Teacher Evaluation System, Formative Instructional Practices, Student Learning Objectives, Value-Added Analysis, College & Career Readiness, Student Growth Measures, Evidence of Student Learning, Teacher-Based Teams, New Accountability System, Performance-Based Compensation, 21st Century Skills, Ohio Principal Evaluation System, Ohio Educator Preparation Metrics, Ohio Improvement Process, Performance Assessments, Doing Lesson Plans, Keeping Students Busy, Grading Homework, Defending Student Grades, Teacher Evaluation

© 2012, Battelle for Kids. The Ohio Context

© 2012, Battelle for Kids. Ohio Accountability System
- District and School Accountability System is being reformulated
- Ohio's ESEA waiver is approved
- Teacher Accountability: OTES
- Principal Accountability: OPES

© 2012, Battelle for Kids. ESEA Waiver Approved (Highlights)
- Implementation of rigorous standards, assessments and evaluations
- Replacement of AYP
- Cutting achievement gaps in half in 6 years
- Change of the rating system to an A-F system
- Freeing schools from some federal reporting requirements

© 2012, Battelle for Kids.  > “Teaching” tab > Ohio Educator Standards > Educator Evaluation Systems in Ohio

© 2012, Battelle for Kids. Student Growth Measures

© 2012, Battelle for Kids. OTES Look-Up Table: the end result must be one of three SGM categories (Above Average, Average, Approaching Average), derived from the EVAAS® Teacher Value-Added Report categories (Most Effective through Least Effective).

© 2012, Battelle for Kids. Ohio Principal Evaluation System

© 2012, Battelle for Kids. Value-Added Impacts All Stakeholders
- Higher Ed: Improve the quality of teacher preparation programs. Value-added is a required component of the educator licensure program, and the value-added metric impacts authorization to offer a teacher preparation program.
- Teachers: Improve the quality of instruction. Value-added information can reveal areas of strength and areas of challenge, and the value-added metric impacts Ohio teacher evaluation.
- Students: Improve student achievement. Value-added information can help identify student needs.

© 2012, Battelle for Kids. Key Statewide Deliverables of RttT
- Teacher Value-Added Reporting
  - 30% of LEAs link in Year 1 of RttT (reports received fall 2011), primarily LEAs in Battelle for Kids' expanded value-added reports projects along with some SIG schools
  - 60% of all LEAs link in Year 2 (represents RttT LEAs)
  - 100% of all LEAs in Ohio link in Years 3 & 4
- Requires teacher linkage each spring to verify teacher assignments and teachers' instructional time with students
- Professional development and resources will address the use of value-added for school improvement and the implications of teacher-level reporting

© 2012, Battelle for Kids. Ohio Regional Fall Workshops 2011 and 2012

© 2012, Battelle for Kids. Online Courses Completed

© 2012, Battelle for Kids. Link/Roster Verification

© 2012, Battelle for Kids. Resources to Support Value-Added in Ohio

© 2012, Battelle for Kids. Ohio's Value-Added Network of Support: VALs & DVALs
- Value-Added Leaders (VALs): 90 VALs who support district/community school value-added teams
- District Value-Added Leaders (DVALs): on average, a 3-5 person team from districts/community schools who provide support to principals and teachers in the use of value-added information
To find your local support system:
1. Go to the Ohio Student Progress Portal
2. Choose "Value-Added Network of Support" from Quicklinks
3. Use the "Find your VAL/DVAL" feature to contact your local support system

© 2012, Battelle for Kids. Helpful Tools For Your Use
- EVAAS® Interactive Site – Diagnostic Tool
- BFK Ohio Student Progress Portal
- Online courses via OhioLearn: value-added and FIP
- Focus Guides: a system of continuous improvement
- Understanding & Using Value-Added Analysis Toolkit
- VA Book: How to Use Value-Added to Improve Student Learning

© 2012, Battelle for Kids. Resources Available to You
December 13 Value-Added Training
- A.M. Session (8:30-Noon), OCLC Lakeview Room
  - Value-Added in the Ohio Context
  - Using Value-Added for School Improvement
- Working Lunch: getting access to resources
  - Value-Added Toolkit
  - How to Use Value-Added Analysis to Improve Student Learning: A Field Guide for School and District Leaders (Corwin)
  - Access to online courses
  - Focus Guides
  - Syllabus ideas
- P.M. Session (12:45-3:30)
  - John White from SAS EVAAS
  - Value-Added Modeling for Ohio
- Registration: click on Education in Ohio

© 2012, Battelle for Kids. Understanding Key Value-Added Reports

© 2012, Battelle for Kids.  We must expect progress for ALL students Jacob-like Adam-like Standard Grade Proficiency Why Value-Added is Necessary

© 2012, Battelle for Kids. Achievement and Progress

© 2012, Battelle for Kids. School Performance and Poverty Level – Math Battelle for Kids is utilizing visual representations of copyrighted EVAAS® Web reporting software from SAS Institute Inc. in this document for instructional purposes.

© 2012, Battelle for Kids. School V-A Gains and Poverty Level – Math Battelle for Kids is utilizing visual representations of copyrighted EVAAS® Web reporting software from SAS Institute Inc. in this document for instructional purposes.

© 2012, Battelle for Kids.  How do value-added measures support what we know about schools? The Power of Two: Achievement & Progress Progress One Year’s Growth Achievement Test Results Standard Low Progress Low Achievement Low Progress High Achievement High Progress Low Achievement High Progress High Achievement School A School B School C School D School E School F School H School K School G School J

© 2012, Battelle for Kids. What is a Growth Measure?  “Growth, in its simplest form, is a comparison of the test results of a student or group of students between two points in time where a positive difference would imply growth.” — Excerpted from Selecting Growth Measures: A Guide for Educational Leaders, Battelle for Kids.
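To make this definition concrete, here is a minimal sketch of the simplest possible growth measure, the difference between two testing occasions. The scores and variable names below are illustrative assumptions, not data from the presentation, and this is not the EVAAS model itself.

```python
# Minimal sketch of the simplest growth measure: the difference between two
# test administrations. Scores below are hypothetical, not real student data.

fall_scores   = [412, 398, 455, 430, 401]   # point in time 1
spring_scores = [425, 405, 470, 441, 399]   # point in time 2

# Growth for each student: a positive difference implies growth.
individual_growth = [post - pre for pre, post in zip(fall_scores, spring_scores)]

# A crude group growth measure: the average of the individual differences.
mean_growth = sum(individual_growth) / len(individual_growth)

print(individual_growth)       # [13, 7, 15, 11, -2]
print(round(mean_growth, 1))   # 8.8 -> average gain for the group
```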

© 2012, Battelle for Kids.  Tests are the most convenient way to measure students’ achievement levels, but:  All measures have error  Students don’t always score where they should Guessing Cheating Other outside factors beyond a teacher’s control  Small numbers of students in some teachers’ classrooms make it even more difficult to produce a system that is fair Why is it so difficult to measure growth?

© 2012, Battelle for Kids.  Tests must possess the following three properties:  They must be highly correlated to curricular objectives  They must have sufficient stretch to differentiate student achievement levels at both the lower and higher ends  They must be sufficiently reliable Required Test Properties for Value-Added Analysis

© 2012, Battelle for Kids.  Follows individual students across time  Uses all available student test data  Students’ growth is compared to their own history  Estimates the school’s influence or “school effect” on a group of students  Growth expectations are a policy decision and can be fixed (pre-determined standard) or normative (compared to the pool)  Statistical models can accommodate various testing regimens EVAAS ® : Value-Added Analysis Overview

© 2012, Battelle for Kids. EVAAS® Information Used for Diagnostic Purposes
- Value-added measure
  - A group statistic: measures the impact schools and teachers have on a group of students
  - It's about us, the adults
  - Tells us: Is this "program" working? For whom?
  - It's about the past: the fall report release represents the effects of the program in the prior school year
- Projection information
  - It's about individual students
  - Tells us: To what extent are students on a positive trajectory?
  - It's about the future: the probability of future success
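As a rough illustration of the distinction above, the sketch below contrasts a group statistic (a mean gain for a program, looking backward) with an individual projection (a probability of future success, looking forward). The projection mechanics here, a predicted score with a normally distributed prediction error compared against a proficiency cut, are generic assumptions, not the actual EVAAS projection methodology.

```python
# Group statistic vs. individual projection (illustrative only).
from statistics import NormalDist

# Group statistic (about the adults, about the past): mean gain of a program.
program_gains = [3.1, -0.4, 2.2, 1.5, 0.8]
print("program mean gain:", sum(program_gains) / len(program_gains))

# Individual projection (about one student, about the future): probability of
# reaching a proficiency cut, given a predicted score and its uncertainty.
predicted_score, prediction_sd, proficiency_cut = 415.0, 12.0, 400.0
p_success = 1 - NormalDist(predicted_score, prediction_sd).cdf(proficiency_cut)
print("probability of reaching the cut:", round(p_success, 2))   # ~0.89
```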

© 2012, Battelle for Kids. What are you going to see in your value-added reports?
- Standard error calculations
  - Establish a confidence band: a range of values plus or minus around the most likely value
- Scores represented on the Normal Curve Equivalent (NCE) scale rather than scaled scores or percentiles
  - Provides equal intervals from 1 to 99
  - Allows for averaging over time
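Here is a hypothetical example of how a standard error turns a single estimate into a confidence band; the estimate, standard error, and the choice of a plus-or-minus-two-standard-error band are invented for illustration, not values from an actual EVAAS report.

```python
# Sketch: a standard error turns one value-added estimate into a band of
# plausible values around the most likely value.

gain_estimate = 1.8     # most likely value (e.g., mean NCE gain)
standard_error = 1.1

lower = gain_estimate - 2 * standard_error
upper = gain_estimate + 2 * standard_error
print(f"Estimate {gain_estimate} with band [{lower:.1f}, {upper:.1f}]")

# Because the band still includes 0.0, this evidence alone would not support
# a confident claim that the group grew more (or less) than expected.
```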

© 2012, Battelle for Kids. What is a Normal Curve Equivalent (NCE)? [Chart: a normal distribution of scores with the Normal Curve Equivalent scale and the corresponding Percentile Equivalent scale along the horizontal axis.]
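As general background (not stated on the slide itself), NCEs are conventionally defined as an equal-interval rescaling of the normal curve, NCE = 50 + 21.06 x z, chosen so that NCEs of 1, 50, and 99 coincide with the 1st, 50th, and 99th percentiles. A small sketch of that conversion, assuming SciPy is available for the inverse normal CDF:

```python
# Percentile-to-NCE conversion using the standard definition
# NCE = 50 + 21.06 * z. Requires scipy (an assumed dependency).
from scipy.stats import norm

def percentile_to_nce(percentile: float) -> float:
    z = norm.ppf(percentile / 100.0)   # inverse normal CDF
    return 50.0 + 21.06 * z

for p in (1, 10, 25, 50, 75, 90, 99):
    print(f"percentile {p:2d} -> NCE {percentile_to_nce(p):5.1f}")

# Unlike percentiles, NCEs are equally spaced, which is why they can be
# averaged across students and across years.
```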

© 2012, Battelle for Kids. Conceptual Example
Scale scores are converted to NCEs (in the slide's example, a scale score of 394 converts to an NCE of 46 and a scale score of 425 to an NCE of 57). For the five students shown, the Grade 6 baseline NCEs are 46, 50, 42, 46, and 52, giving a Mean Baseline of 47.2; their Grade 7 observed NCEs give a Mean Observed of 52.6.
A crude measure of the growth for this group is 5.4 NCEs:
Growth = Mean Observed - Mean Baseline = 52.6 - 47.2 = 5.4 (Mean NCE Gain)
The actual scale scores are adjusted based on prior testing to protect teachers from errors of measurement (on a given day, one test may not accurately reflect a student's proficiency level).
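The slide's arithmetic can be reproduced directly from the baseline NCEs and the reported observed mean. The sketch below performs only the crude mean-gain calculation, not the measurement-error adjustment the slide mentions.

```python
# Reproduce the slide's crude mean NCE gain from the figures given above.

baseline_nces = [46, 50, 42, 46, 52]                     # Grade 6 baseline NCEs
mean_baseline = sum(baseline_nces) / len(baseline_nces)  # 47.2
mean_observed = 52.6                                     # Grade 7 observed mean NCE

mean_nce_gain = mean_observed - mean_baseline
print(round(mean_baseline, 1), round(mean_nce_gain, 1))  # 47.2  5.4
```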

© 2012, Battelle for Kids. Important Notice: Battelle for Kids is utilizing visual representations of copyrighted EVAAS® Web reporting software from SAS Institute, Inc. in this presentation for instructional purposes.

© 2012, Battelle for Kids. Key Reports School Value-Added (MRM) Battelle for Kids is utilizing visual representations of copyrighted EVAAS ® web reporting software from SAS Institute Inc. in this document for instructional purposes.

© 2012, Battelle for Kids. Key Reports School Diagnostic (MRM)

© 2012, Battelle for Kids. Key Reports School Value-Added (URM)

© 2012, Battelle for Kids. Sample Teacher Value-Added Report. Battelle for Kids is utilizing visual representations of copyrighted EVAAS® Web reporting software from SAS Institute Inc. in this document for instructional purposes. (From this report, you can view the Teacher Diagnostic Report and the students linked to/included in the report.)

© 2012, Battelle for Kids. Teacher Value-Added Report (Aggregate portion of report) What do the levels mean?

© 2012, Battelle for Kids. Sample Teacher Diagnostic Report. Battelle for Kids is utilizing visual representations of copyrighted EVAAS® Web reporting software from SAS Institute Inc. in this document for instructional purposes. (From this report, you can view the students in the first tertile.)

© 2012, Battelle for Kids. School Teacher Effectiveness Summary (Math)

© 2012, Battelle for Kids. Questions? Resource information can be found on the Ohio Student Progress Portal. Support Desk: or (866). Thank you!