
DVAS Training (10-03-05)

Presentation Outcomes
- Learn rationale for value-added progress measures
- Receive conceptual overview of value-added analysis
- View sample value-added reports
- See how value-added information fits with school improvement
Find out how Battelle for Kids can help.

Rationale for Value-Added Progress Measures
Why are traditional achievement measures alone insufficient for assessing student achievement?

The Changing Educational Landscape
1960s: Mastery learning
1970s: Behavioral objectives
1980s: Minimum competencies
1990s: Outcomes-based education
2000s: Standards-based education

What's the Upshot?
Looking at the changing educational landscape, a clear pattern emerges: the focus has moved from what goes into a child's education to what comes out of the process.

The Shift in Focus
From teaching inputs:
- Context (room, furniture, master schedule, course of study)
- Resources (number/quality of books, computers, materials)
- Capacities (knowledge of subject and teaching/learning processes, classroom control, lesson planning)
To learning outputs:
- State content standards and local curriculum aligned to standards
- Annual paper/pencil tests to measure achievement
- State report cards at the district/building level
- Good teaching = high student performance
The primary measure in an output-focused system is student scores on statewide achievement tests.

Stair-Step Expectations
Most achievement measures imply:
- Achievement test scores are enough to show growth
- Students start at the same place
- Students progress at the same rate

Reality
- Students start at different places
- Students progress at different rates
- Educators need more than individual test scores to evaluate a school's impact on student learning

Need for Progress Measures
To measure school effectiveness, we must pay attention to passage rates AND annual student progress.

Question for Educators Today
How do we maximize student progress each year, regardless of where students start?
85% of the public believes student progress is the best measure of a school's effectiveness. (Phi Delta Kappa/Gallup Poll, 2005)

Conceptual Overview of Value-Added Measures in Ohio
How is performance data used to produce value-added information?

Two Value-Added Systems in Ohio
- Project SOAR
- Ohio's Accountability System

Project SOAR
- Operated by Battelle for Kids
- Began in 2002 with 42 school districts; now has 106 districts and 3 charter schools
- Provides analysis in 5 subjects for grades 3-10
- Uses state and non-state test data
- Uses a prediction-based value-added approach
- Expected growth is normative ("Average Growth")

Ohio's System
- Operated by the Ohio Department of Education
- Begins as a 4th-grade pilot in 2006 in all districts
- Provides analysis in math and reading in grades 4-8
- Uses only state achievement tests
- Uses a mean-gain value-added approach
- Expected growth is likely to be a fixed amount

What Do the Two Have in Common?
- Both utilize the power of longitudinal data, linking each student's assessment data together over time
- Both compare students' current test scores to baseline scores
- Both provide value-added information in Web-based reports
- Both use the statistical power of EVAAS™ to produce the value-added analysis

Why EVAAS™ in Ohio?
- EVAAS™ is the value-added methodology pioneered by Dr. William Sanders of SAS
- Applies the most sophisticated statistical methodologies available to ensure reliability
- Allows for the use of all student test data
- Provides valuable diagnostic information
- Offers approaches for handling different types of test data
- Identified by RAND and others as a preferred model
- Used statewide in Tennessee for 10 years

What is Value-Added?
Value-added, in its simplest form, is an accurate measure of the present (observed scores) minus an accurate measure of the past (baseline scores) for the same group of students:

Mean Observed Score - Mean Baseline Score = Value-Added

[Slide table: Year 1 (3rd grade) and Year 2 (4th grade) math scores for the same five students; the individual scores did not survive transcription. The mean observed score minus the mean baseline score would be a crude measure of the value-added.]
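As a minimal illustration of this crude calculation, here is a short Python sketch. The scores are hypothetical, since the slide's own numbers did not survive transcription:

```python
# Crude value-added: mean observed score minus mean baseline score
# for the same group of students. All scores below are hypothetical.

baseline_scores = [352, 361, 348, 370, 359]  # Year 1, 3rd-grade math
observed_scores = [360, 366, 355, 378, 361]  # Year 2, 4th-grade math, same students

mean_baseline = sum(baseline_scores) / len(baseline_scores)
mean_observed = sum(observed_scores) / len(observed_scores)

value_added = mean_observed - mean_baseline
print(f"Crude value-added estimate: {value_added:.1f} scale-score points")
```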

Why Are Two Systems Needed?
Different approaches are needed to provide reliable baseline scores from the different kinds of tests used in Ohio:
- When both tests are on a common scale, like the Ohio achievement tests, the baseline can come from the prior year's test scores (Mean Prior Score approach - Ohio System)
- When the tests are on different scales, the baseline must be calculated (Mean Predicted Score approach - Project SOAR)
Sophisticated statistics are required in both approaches to ensure that all students' data are included, that the information is reliable, and to add predictive diagnostic power. The sketch below makes the contrast concrete.
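A small hypothetical sketch computing value-added under each baseline choice; the numbers are invented, and real implementations add the sophisticated statistics noted above:

```python
# Hypothetical scores for one cohort. Which baseline applies depends on
# whether this year's and last year's tests share a scale.
observed   = [360, 355, 378, 361, 366]  # this year's test
prior_year = [352, 348, 370, 359, 361]  # last year's test (common scale)
predicted  = [358, 352, 372, 360, 363]  # calculated predictions (different scales)

def mean(xs):
    return sum(xs) / len(xs)

# Mean Prior Score approach (Ohio System): tests on a common scale.
print("Mean Prior Score value-added:    ", mean(observed) - mean(prior_year))

# Mean Predicted Score approach (Project SOAR): tests on different scales.
print("Mean Predicted Score value-added:", mean(observed) - mean(predicted))
```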

What Are Common Scales?
- Vertical scales increase in equal intervals as you move up grade levels
- Horizontal scales remain the same as you move up grade levels
[Slide charts: a vertical scale shows the score at the 50th percentile rising from 3rd through 8th grade; a horizontal scale shows the Ohio Achievement Tests' "Proficient" score staying level from 3rd through 8th grade.]
When you have common scales, the prior years' scores can be used as the baseline.

How Are Test Data Used When They Are Not on a Common Scale?
- All available test data are collected and linked for each student
- Districts are grouped into pools based on common testing histories
- Relationships between and among all tests in the pool are used to create predicted baseline scores: what students like them would typically score on this year's test

How Do You Create a Longitudinal Record?
- Collect all individual student data available for a minimum of three years
- Link each student's annual test data together to create a longitudinal record
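A minimal sketch of the linking step in Python with pandas, assuming a hypothetical flat file of annual scores keyed by student ID; the column names and values are illustrative, not the actual SOAR or ODE data layout:

```python
import pandas as pd

# Hypothetical annual test records: one row per student per year.
records = pd.DataFrame({
    "student_id": [101, 101, 101, 102, 102, 102],
    "year":       [2003, 2004, 2005, 2003, 2004, 2005],
    "subject":    ["math"] * 6,
    "score":      [352, 360, 371, 348, 355, 350],
})

# Link each student's annual scores into one longitudinal row:
# years become columns, so at least three years sit side by side.
longitudinal = records.pivot_table(index="student_id", columns="year", values="score")
print(longitudinal)
```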

How Are Comparison Pools Created?
Districts are grouped into pools based on common testing histories at each grade-level cohort. [Slide example: Mike's cohort's testing history determines Mike's pool.]
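One way to picture this grouping, using hypothetical district testing histories (the district and test names are invented for illustration):

```python
from collections import defaultdict

# Which tests each district gave the cohort, oldest to newest.
# Districts sharing the same history land in the same comparison pool.
district_histories = {
    "District A": ("OPT-3", "TerraNova-4", "OPT-5"),
    "District B": ("OPT-3", "TerraNova-4", "OPT-5"),
    "District C": ("OPT-3", "OPT-4", "OPT-5"),
}

pools = defaultdict(list)
for district, history in district_histories.items():
    pools[history].append(district)

for history, districts in pools.items():
    print(history, "->", districts)  # A and B pool together; C pools alone
```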

How Are the Relationships Between Tests Within a Pool Defined?
Relationships between and among all tests in the pool are calculated and can be represented as a number, or correlation. [Slide example drawn from Mike's testing history.]
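A sketch of how those correlations might be computed, using hypothetical pool data; the test names and scores are invented, and real pools would hold thousands of students:

```python
import pandas as pd

# Each column is one test in the cohort's history; each row is one student.
pool = pd.DataFrame({
    "grade3_math":    [352, 348, 370, 359, 361],
    "grade4_math":    [360, 355, 378, 361, 366],
    "grade4_reading": [402, 390, 415, 398, 405],
})

# Pairwise correlations quantify the relationships between tests;
# these are the numbers the prediction step draws on.
print(pool.corr())
```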

How Much Prior Data Are Used?
Up to 5 years of student test data, together with the relationships between tests, are used to calculate a predicted baseline score for this year's subject-area tests.

How Are Predicted Scores Calculated?
Using the test data of students with similar prior performance on common tests, and the tests' relationships to each other, allows for the creation of statistically reliable predicted scores for each student in each subject. [Slide graphic: Mike's prediction alongside predictions for 19 other students.]
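The actual EVAAS prediction model is far more elaborate; as a rough stand-in, here is a least-squares sketch on hypothetical data showing how prior scores and their relationships can yield a predicted baseline score for a student like Mike:

```python
import numpy as np

# Hypothetical prior scores (two earlier tests) and current-year scores
# for students in the pool. A plain least-squares fit stands in here
# for the EVAAS machinery.
priors  = np.array([[352, 402], [348, 390], [370, 415], [359, 398], [361, 405]])
current = np.array([360, 355, 378, 361, 366])

# Fit current-year score as a linear function of the prior scores.
X = np.column_stack([np.ones(len(priors)), priors])  # add an intercept column
coef, *_ = np.linalg.lstsq(X, current, rcond=None)

# Predicted (baseline) score for a student with a similar testing history.
mike_prediction = np.array([1, 355, 400]) @ coef
print(f"Mike's predicted baseline score: {mike_prediction:.1f}")
```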

Are All Students Used in the Analysis?
Only students with enough prior data to create a predicted score are included. [Slide graphic: for your school, each included student's predicted score rolls up into a Mean Predicted Score (the baseline), and each student's actual score rolls up into a Mean Observed Score.]

How Do You Estimate the School's Effect on Student Growth?
Mean Student Score - Mean Predicted Score (with some statistical reliability factored in) = School Effect
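A sketch of this estimate on hypothetical data; the "statistical reliability" adjustment shown is a simple shrinkage toward zero, included only to illustrate the idea, not the EVAAS formula:

```python
import numpy as np

# Hypothetical observed and predicted scores for one school's students
# (only students with enough prior data to be predicted are included).
observed  = np.array([360, 355, 378, 361, 366, 349, 372])
predicted = np.array([358, 357, 371, 363, 362, 352, 368])

raw_effect = observed.mean() - predicted.mean()

# Illustrative reliability adjustment: shrink the raw effect toward zero
# when it rests on few students. The constant k is hypothetical.
n, k = len(observed), 30
school_effect = raw_effect * n / (n + k)
print(f"Raw effect: {raw_effect:.2f}, adjusted school effect: {school_effect:.2f}")
```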

Sample Value-Added Reports
What information do value-added reports provide that was previously unavailable?

Achievement & Progress

High Achievement, High Progress (2005 School Value-Added Report for OPT Math)
- High achievement: high mean scores = 89% passage
- High progress: positive school effects

High Achievement, Low Progress (2005 School Value-Added Report for OPT Math)
- High achievement: high mean scores = 85% passage
- Low progress: negative school effects

Low Achievement, High Progress (2005 School Value-Added Report for OPT Math)
- Low achievement: low mean scores = 69% passage
- High progress: positive school effects

Student A Report

Student B Report

Student B Projection

Connecting to School Improvement Efforts
How can value-added progress measures enhance school and district improvement at the:
- District level
- School level

District Improvement Efforts
- Identify patterns of progress across buildings, grade levels, and subject areas
- Locate areas of strength to build upon
- Locate areas for improvement

District Value-Added Summary Report (2005, 4th Grade)

School Search Report (2005 School Search, 4th Grade Math)

School Improvement Efforts
- Identify patterns of progress across grade levels, subject areas, and student subgroups
- Locate areas of strength to build upon
- Locate areas for improvement

School Value-Added Report (2005, Reading)

Performance Diagnostic Report (2005 Performance Diagnostic for Reading, 5th Grade Means)

School Diagnostic Report (2005 School Diagnostic for Reading, 4th Grade Means)

Search for Students by Subgroup or Achievement (Student Search)

Identified At-Risk Students (Student Search Results)

In Summary, Value-Added Information Shows…
- How much progress students make in each subject area and grade level
- How much progress students at different previous achievement levels have made
- How students' progress in one curricular area or program compares to their progress in another
- Whether individual students are making adequate progress to meet state standards
Without data, all we have are opinions!

School Strategic Planning Cycle
Start-of-the-year meetings (pre-school):
- Examine value-added and other school performance information by grade level and/or subject area
- Assess strengths and weaknesses and potential actions, grade level by grade level, subject by subject
- Celebrate progress
- Set 1-2 goals for each grade level, department, or team around strengths and weaknesses
- Create action plans and accountabilities
Throughout the year: grade-level, department, and/or team meetings (recurring) to work on team-specific goals.

Implementation Checklist
- Select a lead person in the school district who understands value-added information and can access, interpret, and conduct presentations
- Train school leaders (principals and lead teachers) to access, interpret, and conduct presentations
- Have school leaders share value-added information with school staff members
- Have school staffs use value-added information to assess annual progress and set goals for next year

Ohio’s Scale Up Plan

For more information, contact: www.BattelleforKids.org, (866) KIDS-555