Infusing Value-Added into the SLO Process, August 2015.

Presentation transcript:

Infusing Value-Added into the SLO Process, August 2015

Turn and Talk: What do you know about value-added data?

Page 1

Beginning of Year: Working collaboratively with their evaluator or a peer, educators draw upon the SLO and Outcome Summary Process Guide (see page 2) to develop a minimum of one SLO. The development of the SLO now must include the review of teacher and principal value-added, as well as graduation rates or schoolwide reading value-added (as appropriate to the role of the educator). Educators continue to document the goal within the appropriate online data management system (e.g., Teachscape or MyLearningPlan). Collaborative learning-focused conversations are required as part of the process, but flexibility exists regarding with whom educators collaborate in Supporting Years. However, in Summary Years, educators must conduct this process with their evaluators.

Page 2

What is new or different from last year?

Page 3

TEACHERS
Teacher Value-Added and Schoolwide Reading: When developing SLOs, teachers must review individually, as well as with teacher teams at both the grade level and across the content area (e.g., schoolwide reading value-added), to identify trends (i.e., strengths and areas for growth) across time. These trends can inform SLOs or professional practice goals, based on areas of need. Working in teams with other teachers could inform the development of a team SLO that may align to a School Learning Objective identified by the principal. Value-added trends may also illuminate strategies that have worked well, based on areas of strength, and can support ongoing instructional efforts. Working in teams with other teachers could provide the opportunity to share best practices and successful strategies which support school improvement plans and/or goals. Let's walk through this…

Graduation Rate: When developing SLOs, high school teachers must review graduation rate data across time to identify positive or negative trends regarding the matriculation of their school's students. During this review, teachers should reflect on how their practice has supported the trends within the graduation rate data. Teachers should also review the data in vertical and horizontal teams to examine school (and district) practices which positively and negatively impact graduation rates. This analysis can inform the development of SLOs, as well as professional practice goals, to support the improvement of graduation rates of the educator's students. This review can also illuminate the success of various college- and career-ready strategies implemented by teachers and across the school that could be modified or duplicated.

Educators are not required to develop a goal based on these data, or to develop a goal with the intention to improve these data, unless the data indicate that is necessary. As always, the purpose of the Educator Effectiveness System is to provide information that is meaningful and supports each individual educator's growth in their unique roles and contexts. By reviewing multiple data points, including those listed above, the educator has access to a more comprehensive view of their practice and a greater ability to identify areas of strength and need, both of which can inform the development of goals, as well as instructional/leadership strategies which can support progress towards goals. Note: Due to the lag in the data DPI provides to districts, and the time of year at which the data arrive (i.e., the following school year), educators should only use the data to review trends across time when developing an SLO. Educators should not use the data to score SLOs.

Our MISSION as educators is to improve teaching and learning.

Mindset of Improvement: "You don't have to be sick to get better!" (Michael Josephson)

Mindset of Improvement: Continuous Improvement is for EVERYONE

There are 2 general ways to look at student assessment data:
Attainment model: a "point in time" measure of student proficiency; compares the measured proficiency rate with a predefined proficiency goal.
Growth model: measures average gain in student scores from one year to the next; accounts for the prior knowledge of students.
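The two models can be made concrete with a small sketch. All scores, the proficiency cutoff, and the goal below are hypothetical, chosen only to show how the same students look under each model:

```python
# Hypothetical scale scores for the same five students in two years.
year1 = [2400, 2380, 2450, 2410, 2390]
year2 = [2430, 2400, 2470, 2445, 2420]

PROFICIENT = 2425  # hypothetical proficiency cutoff
GOAL = 0.60        # hypothetical predefined proficiency goal

# Attainment model: point-in-time proficiency rate vs. the goal.
rate = sum(score >= PROFICIENT for score in year2) / len(year2)
print(f"Attainment: {rate:.0%} proficient (goal {GOAL:.0%})")  # 60% (goal 60%)

# Growth model: average gain from one year to the next, which
# accounts for where each student started.
gains = [b - a for a, b in zip(year1, year2)]
avg_gain = sum(gains) / len(gains)
print(f"Growth: average gain of {avg_gain:.1f} scale score points")  # 27.0
```

Note how the attainment rate says nothing about students who improved substantially but remained below the cutoff; the growth model credits that improvement.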

What is Value-Added? It is a type of growth model that measures the contribution of schooling to student performance on the WKCE in reading and in mathematics. It uses statistical techniques to separate the impact of schooling from other factors that may influence growth, and focuses on how much students improve on the WKCE (or our new assessment) from one year to the next, as measured in scale score points.

A Clearer Data Picture: Many data pieces give us a fuller picture… STAR, WKCE or Badger, AIMSweb, ACT, WorkKeys, Classroom Assessments, AP, Surveys, Aspire, Observation Data, PALS, VA Data. Why would we care about value-added data?

VA allows for fairer growth comparisons to be made (in contrast to pure achievement):
School A: 90% Proficiency, 6% Free and Reduced
School B: 86% Proficiency, 90% Free and Reduced

VA allows for fairer growth comparisons to be made (in contrast to pure growth). We know that in Wisconsin, certain groups of students do not grow (or achieve) at the same rate as others. This can be due to the achievement level of a child (the lowest students can grow the most). This can also be related to demographics such as: Special Ed status, ELL, Race/ethnicity, Economically Disadvantaged, etc.

Hi! I'm a 4th grade boy. I got a scale score of 2418 on my WKCE in reading this year! And these are all the other boys in WI who had the exact same scale score as me. (4th grade)

Now I'm in 5th grade and just got a scale score of 2449 on my reading WKCE! I grew 31 points. All of the other boys took the test again, too. Their average scale score was 2443. Their growth was 25 points. (5th grade)

So we would say that my teachers in 4th grade had a higher Value-Added than would be expected. (Average growth was 25 points; I grew 31 points.)
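The arithmetic behind this example is the heart of a value-added estimate: actual growth minus the average growth of comparable students. A minimal sketch using the numbers from the slides (a real model controls for many more factors than starting score):

```python
# Scores from the slides: 2418 on the grade 4 WKCE reading test,
# 2449 on the grade 5 test the following year.
my_grade4, my_grade5 = 2418, 2449
my_growth = my_grade5 - my_grade4  # 31 points

# Average growth of the comparison group: boys statewide who had
# the exact same grade 4 scale score (from the slides).
expected_growth = 25

# Value-added residual: actual growth minus expected growth.
residual = my_growth - expected_growth
print(f"Grew {my_growth} points vs. {expected_growth} expected: {residual:+d}")
```

A positive residual (+6 here) is growth above expectation; a negative one is growth below expectation, even though the student is still learning.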

Using the same process, VA controls for these factors outside the school's influence: Race/Ethnicity, Gender, Section 504, Economic Status, Disability (by type), Prior Year Score (reading and math), English Proficiency (by category level), Mobility.

How do they decide what to control for?
Check 1: Is this factor outside the school or teacher's influence?
Check 2: Do we have reliable data?
Check 3: If not, can we pick up the effect by proxy?
Check 4: Does it increase the predictive power of the model?

Checking for Understanding: What would you tell a 5th grade teacher who said they wanted to include the following in the Value-Added model for their results?
A. 5th grade reading curriculum
B. Their students' attendance during 5th grade
C. Education level of the parents
D. Student motivation
Check 1: Is this factor outside the school or teacher's influence?
Check 2: Do we have reliable data?
Check 3: If not, can we pick up the effect by proxy?
Check 4: Does it increase the predictive power of the model?

Reporting Value-Added: In the latest generation of Value-Added reports, estimates are color coded based on statistical significance. This represents how confident we are about the effect of schools and teachers on student academic growth. Green and Blue results are areas of relative strength; student growth is above average. Gray results are on track; in these areas, there was not enough data available to differentiate this result from average. Yellow and Red results are areas of relative weakness; student growth is below average.

Value-Added is displayed on a 1-5 scale for reporting purposes. About 95% of estimates will fall between 1 and 5 on the scale. Most results will be clustered around 3.0, which represents meeting predicted growth for your students. Since predictions are based on the actual performance of students in your state, 3.0 also represents the state average growth for students similar to yours. Numbers lower than 3.0 represent growth that did not meet prediction: students are still learning, but at a rate slower than predicted. Numbers higher than 3.0 represent growth that beat prediction: students are learning at a rate faster than predicted.
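The description above is consistent with a standardized estimate recentered at 3.0: if about 95% of results fall between 1 and 5, the implied standard deviation is roughly 1 point (two standard deviations on either side of 3.0). A hedged sketch of that mapping; the exact reporting transformation is an assumption here, not taken from the slides:

```python
def to_report_scale(z: float) -> float:
    """Map a standardized value-added estimate (mean 0, SD 1) onto
    the 1-5 reporting scale centered at 3.0.

    Assumption: report = 3.0 + z, so ~95% of estimates (|z| <= 2)
    fall between 1.0 and 5.0, matching the description in the deck.
    """
    return round(3.0 + z, 1)

print(to_report_scale(0.0))   # 3.0 -> met predicted growth
print(to_report_scale(0.8))   # 3.8 -> beat prediction
print(to_report_scale(-1.2))  # 1.8 -> slower than predicted
```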

Grade 4 Reading, 30 students, 95% Confidence Interval. Value-Added estimates are provided with a confidence interval. Based on the data available for these thirty 4th Grade Reading students, we are 95% confident that the true Value-Added lies between the endpoints of this confidence interval (between 3.2 and 4.4 in this example), with the most likely estimate at the center of the interval (3.8).

Confidence Intervals (Grade 4 Reading and Grade 4 Math confidence intervals shown side by side): Color coding is based on the location of the confidence interval. The more student data available for analysis, the more confident we can be that growth trends were caused by the teacher or school (rather than random events).
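Putting the scale and the confidence interval together, the color coding can be read as a decision rule: a result is flagged as a strength or weakness only when its entire interval sits above or below 3.0. A sketch of that rule under this assumption (the exact PVAAS/VARC thresholds may differ); the 3.2 to 4.4 interval is the Grade 4 reading example from the deck:

```python
def color_code(lower: float, upper: float) -> str:
    """Classify a value-added result by where its 95% confidence
    interval falls relative to the expected-growth value of 3.0.

    Assumption: green/blue when the whole interval is above 3.0,
    yellow/red when it is entirely below, gray when it straddles
    3.0 (cannot be distinguished from average growth).
    """
    if lower > 3.0:
        return "green/blue: above average"
    if upper < 3.0:
        return "yellow/red: below average"
    return "gray: on track (not distinguishable from average)"

print(color_code(3.2, 4.4))  # green/blue: above average
print(color_code(2.6, 3.4))  # gray: on track (not distinguishable from average)
```

This also shows why more data matters: more students shrink the interval, so a genuinely above- or below-average result is less likely to be swallowed by the gray band.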

Let's look at the reports available in School Access File Exchange (SAFE)

We begin with some caveats! VA is one data source among many that provides a different perspective on student growth. VA should never be the sole data source to identify effective/ineffective schooling! Taking VA out of the Student Outcome score allows each educator to decide how (or if) this data informs the SLO process.

This next section is intended for you to use with your own school VA reports. You may choose to use your snipping tool to insert your school data in the appropriate place. Printing color copies of your VA reports for staff might also be helpful.

Page 1 Introduction to VA Color Coding

Page 2 With a partner: What are some observations you can make about this data as a school? How might teacher teams use this data?

Share your thinking

Let’s look at our VA as a school With a partner: What does this data suggest about how we are growing students in reading and in math? Use your snipping tool to insert a screenshot of the top section only of your own school VA report page 2 here…it will look something like this.

Share your thinking

Let's look at our VA by grade With a partner: What does this data suggest about how we are growing students across grades in our school? Use your snipping tool to insert a screenshot of the bottom section only of your own school VA report page 2 here…it will look something like this.

Share your thinking

Pages 3 & 4 With a partner: What are some observations you can make about this subgroup data? What questions do you have?

Let’s look at our reading VA with subgroups With a partner: How effective was our school in growing different groups of students in reading? Use your snipping tool to insert a screenshot of your own school VA report page 3 here…it will look something like this.

Let’s look at our math VA with subgroups With a partner: How effective was our school in growing different groups of students in math? Use your snipping tool to insert a screenshot of your own school VA report page 4 here…it will look something like this.

Share your thinking

Page 5 Introduction to VA Scatter Plots

With a partner: How might a teacher team use this data to identify an area of focus for their SLO?

Pages 6 & 7 Grade level VA and Achievement

Let’s look at our VA and achievement plotted together Use your snipping tool to insert a screenshot of your own school VA/Achievement scatter plots here…they will look something like this. If these are too small to see you may need to print colored copies for pairs or groups. With a partner: What stands out in our school data when we look at achievement and growth together?

Share your thinking

How do/don’t these reports add to our total data picture?

How might today’s learning apply to your own SLO?