www.engageNY.org — Teacher Effectiveness Research. Network Team Institute, January 2012. Amy McIntosh and Kate Gerson, Senior Fellows, Regents Research Fund.

Presentation transcript:

Teacher Effectiveness Research — Network Team Institute, January 2012. Amy McIntosh and Kate Gerson, Senior Fellows, Regents Research Fund. All materials from research studies described here are reprinted with permission of the authors.

Why Are We Here in Utica? Because teacher effectiveness matters.

Tonight's Agenda: Discussion of new research studies that confirm:
– Teacher effectiveness does matter.
– You are working on the right things.

Study Number 1: The Long-Term Impact of Teachers. Any Questions?

Seriously: Study Number One — The Long-Term Impacts of Teachers: Teacher Value-Added and Student Outcomes in Adulthood (Chetty, Friedman & Rockoff).
Study data:
– 2.5 million children followed from childhood to early adulthood in one large district
– Teacher/course linkages and test scores in grades 3-8
– US government tax data from W-2s: on parents AND students
– About parents: household income, retirement savings, home ownership, marriage, age when student born
– About students up to age 28: teen birth, college attendance, earnings, neighborhood "quality"

Key Finding: Teacher effectiveness matters. Having a higher value-added teacher for even one year in grades 4-8 has substantial positive long-term impacts on a student's life outcomes, including:
– Likelihood of attending college (UP 1.25%)
– Likelihood of teen pregnancy (DOWN 1.25%)
– Salary earned in lifetime (UP $25K per avg. student)
– Neighborhood (more college grads live there)
– Retirement savings (UP)

Key Finding: Student Future Earnings

What is "teacher value-added"? A statistical measure of the growth of a teacher's students that takes into account the differences in students across classrooms that school systems can measure but teachers can't control. Value-added is: growth compared to the average growth of similar students.

Teacher Value-added is NOT: test scores alone.
[Chart: average student achievement (2015), Teacher A vs. Teacher B, illustrative scale scores.]
Achievement scores say more about students than teachers.

Teacher Value-added is NOT: growth in test scores alone.
[Chart: average student growth, Teacher A vs. Teacher B, illustrative scale scores.]
Adding average prior achievement for the same students shows Teacher B's students had higher growth.

Teacher Value-added IS: growth compared to similar students.
[Chart: average student growth vs. similar students, Teacher A vs. Teacher B, illustrative scale scores; Teacher A's value-added is +15 (above average), Teacher B's is average.]
Comparing growth to the average growth of "similar" students gives Teacher A the higher "value-added" result.
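
To make the arithmetic concrete, here is a minimal Python sketch of the idea in the slides above: value-added is a student's growth minus the average growth of "similar" students, averaged over a teacher's class. The scores, the expected_growth lookup, and the two teachers are hypothetical illustrations, not data from the study.

```python
# Minimal sketch of the "value-added" idea illustrated above.
# All numbers and the expected_growth lookup are hypothetical; a real
# value-added model controls for many measured student characteristics.
from statistics import mean

# Each record: (prior_score, current_score) for one student.
teacher_a_students = [(630, 655), (640, 668), (650, 672)]
teacher_b_students = [(700, 730), (710, 739), (720, 751)]

def expected_growth(prior_score):
    """Average growth of 'similar' students (hypothetical lookup)."""
    return 10 if prior_score < 680 else 30

def value_added(students):
    """Average of (actual growth - expected growth of similar students)."""
    return mean((current - prior) - expected_growth(prior)
                for prior, current in students)

print(value_added(teacher_a_students))  # 15 -> above average, like Teacher A
print(value_added(teacher_b_students))  # 0  -> average, like Teacher B
```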

Myth-busting.
MYTH: Lots of big research people say value-added isn't reliable. You can't really prove the teacher caused the change in scores.
REALITY: Some researchers say this. Others say it is the best way we have to identify the stronger and weaker teachers. THIS study adds new evidence to support that value-added measures DO measure real differences in the effect different teachers have on student learning.

What do you think would happen? A high value-added teacher (top 5%) arrives in a new school to teach fourth grade. What happens to the new teacher's kids' fourth-grade test scores?

The scores go up.

But what about…?
– Maybe the "high value-added teacher's" kids were all from high-income families? Your model doesn't measure that. The researchers thought of that, got the data, and it doesn't change the fact that having a high value-added teacher matters.
– Maybe "high value-added teachers" are always assigned to the higher-achieving kids. They thought of that, got the data, and it doesn't change the fact that (guess what)…
– Maybe it's just true for the top 5% of teachers. We can't all be superstars. They thought of that (and guess what?).

What this study doesn't answer:
– Once teachers' evaluation results depend on value-added, will their behavior change? Will they teach to the test? Will they cheat? Will they focus on data-driven instruction, Common Core Standards, and teacher practices that research says support student learning?
– What are the specific policy actions to take in a school district? How can you keep high value-added teachers in their schools? What professional development helps people get better? What about teachers who aren't getting any better after 3 or 4 years?

Study Number Two: Measures of Effective Teaching

Study Number Two: Measures of Effective Teaching. A unique project in many ways:
– In the variety of indicators tested:
» 5 instruments for classroom observations
» Student surveys (Tripod Survey)
» Value-added on state tests
– In its scale:
» 3,000 teachers
» 22,500 observation scores (7,500 lesson videos x 3 scores) from trained observers
» 44,500 students completing surveys and supplemental assessments
– And in the variety of student outcomes studied:
» Gains on state math and ELA tests
» Gains on supplemental tests (BAM & SAT9 OE)
» Student-reported outcomes (effort and enjoyment in class)

What measures relate best to student outcomes? Three criteria:
– Predictive power: Which measure could most accurately identify teachers likely to have large gains when working with another group of students?
– Reliability: Which measures were most stable from section to section or year to year for a given teacher? (See the sketch below for one simple way to quantify this.)
– Potential for diagnostic insight: Which have the potential to help a teacher see areas of practice needing improvement?
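
As a rough illustration of the "reliability" criterion, here is a hedged Python sketch: estimate stability as the correlation between the same teachers' scores on one measure in two different years (or sections). The scores are hypothetical, not MET data.

```python
# Illustrative sketch of "reliability" as year-to-year (or section-to-section)
# stability of a measure. Scores are hypothetical, not MET data.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation between two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Same five teachers, scored on one measure in two different years.
year1 = [2.8, 3.1, 3.4, 2.5, 3.0]
year2 = [2.9, 3.0, 3.5, 2.6, 3.2]

print(f"stability estimate: {pearson(year1, year2):.2f}")  # closer to 1 = more stable
```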

Measures have different strengths… and weaknesses.

Key Finding: Use multiple measures.
– All the observation rubrics are positively associated with student achievement gains.
– Using multiple observations per teacher is VERY important (and ideally multiple observers).
– The student feedback survey tested is ALSO positively associated with student achievement gains.
– Combining observation measures, student feedback, and value-added growth results on state tests was more reliable and a better predictor of a teacher's value-added on State tests with a different cohort of students than:
» Any measure alone
» Graduate degrees
» Years of teaching experience
– Combining "measures" is also a strong predictor of student performance on other kinds of student tests. (A sketch of the combining idea follows below.)
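
The slides do not give the MET project's exact weighting scheme, so the sketch below only illustrates the general "combine multiple measures" idea: standardize each measure across teachers, then average with chosen weights. The weights (0.5/0.25/0.25) and all scores are assumptions for illustration, not the study's specification.

```python
# Hedged sketch of a "multiple measures" composite: standardize each measure,
# then combine with weights. Weights and scores are hypothetical.
from statistics import mean, pstdev

def z_scores(values):
    """Standardize a list of scores to mean 0, SD 1."""
    m, s = mean(values), pstdev(values)
    return [(v - m) / s for v in values]

# One score per teacher on each measure (hypothetical).
observation = [2.8, 3.1, 3.4, 2.5, 3.0]    # rubric score
survey      = [3.9, 4.2, 4.5, 3.6, 4.1]    # student feedback
value_added = [-0.1, 0.2, 0.4, -0.3, 0.0]  # growth vs. similar students

weights = {"obs": 0.5, "survey": 0.25, "va": 0.25}  # assumed, for illustration

composite = [
    weights["obs"] * o + weights["survey"] * s + weights["va"] * v
    for o, s, v in zip(z_scores(observation), z_scores(survey), z_scores(value_added))
]
print([round(c, 2) for c in composite])
```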

Framework for Teaching (Danielson)

Student Feedback: related to student learning gains.

Student survey items with the strongest relationship to middle school math gains:
– Students in this class treat the teacher with respect
– My classmates behave the way my teacher wants them to
– Our class stays busy and doesn't waste time
– In this class, we learn a lot every day
– In this class, we learn to correct our mistakes

Student survey items with the weakest relationship to middle school math gains:
– (Rank 38) I have learned a lot this year about [the state test]
– (Rank 39) Getting ready for [the state test] takes a lot of time in our class

Note: Items sorted by absolute value of correlation with student achievement gains. Drawn from "Learning about Teaching: Initial Findings from the Measures of Effective Teaching Project". For a list of Tripod survey questions, see Appendix Table 1 in the Research Report.

Combining observations with other measures improved predictive power.

Compared to MA Degrees and Years of Experience, the Combined Measure Identifies Larger Differences.

Activity: Guidance to Practitioners (page 2/3)
1. Choose an observation instrument that sets clear expectations.
2. Require observers to demonstrate accuracy before they rate teacher practice.
3. When high-stakes decisions are being made, multiple observations are necessary.
4. Track system-level reliability by double-scoring some teachers with impartial observers.
5. Combine observations with student achievement gains and student feedback.
6. Regularly verify that teachers with stronger observation scores also have stronger student achievement gains on average (see the sketch below for one way to spot-check this).
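
For point 6, here is one simple, hedged way an analyst might spot-check that relationship in Python; the data are hypothetical and the split-half comparison is just one convenient approach, not a method prescribed by the report.

```python
# Simple spot-check for guidance item 6: do teachers with stronger observation
# scores also show stronger average student achievement gains?
# Data are hypothetical; a real check would use the district's own records.
from statistics import mean

# (observation_score, avg_student_gain) per teacher
teachers = [
    (2.4, 3.0), (2.6, 4.5), (2.9, 5.0), (3.1, 6.5),
    (3.3, 6.0), (3.5, 8.0), (3.7, 7.5), (3.9, 9.0),
]

ranked = sorted(teachers, key=lambda t: t[0])  # sort by observation score
half = len(ranked) // 2
lower, upper = ranked[:half], ranked[half:]

print("avg gain, lower-half observation scores:", mean(g for _, g in lower))
print("avg gain, upper-half observation scores:", mean(g for _, g in upper))
# If the upper half's average gain is not higher, the observation process
# (or its scoring) deserves a closer look.
```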

Districts with evaluation work in progress. The following districts have been funded by the Gates Foundation in connection with the MET project to implement teacher and leader effectiveness initiatives, including new evaluation systems. Their public web sites tell more about how they are doing this. (Two others, Pittsburgh and Dallas, don't have extensive information on their public sites.)
– Denver Public Schools: LEAP
– Hillsborough County, Florida: Empowering Effective Teachers
– Memphis, Tennessee: Teacher Effectiveness Initiative

How would you answer these common misconceptions?
– New York's evaluation system is based mostly on State test scores, and that's not good.
– A principal knows a good teacher when s/he sees one; we don't need to include value-added results too.
– I've been doing teacher observations for years. I don't need to go to your training.
– Teacher value-added information is unreliable and shouldn't be a part of teacher evaluation.
– By putting test scores into teacher evaluation, everyone will do even more to "teach to the test," and if that doesn't work, they'll cheat.

How would you answer these common misconceptions?

MYTH: New York's evaluation system is based mostly on State test scores, and that's not good.
REALITY: NY uses multiple measures, as research advises. 60% involves measures of educator practice, 20% involves GROWTH on state assessments or comparable measures, and the remaining points will be a locally-selected measure of student growth or achievement.

MYTH: A principal knows a good teacher when s/he sees one; we don't need to include value-added results too.
REALITY: The recent MET study shows that combining observation results and teacher value-added is more predictive and reliable than either measure alone.

MYTH: I've been doing teacher observations for years. I don't need to go to your training.
REALITY: The MET study shows that regularly recalibrating observers against benchmarks of accurate observation ratings is critical to ensuring a valid and reliable evaluation system. Even the best observers can "drift" over time, and the best can help others stay in sync. In addition, NYS training will help everyone identify evidence that the new Common Core standards are being implemented well in classrooms.

How would you answer these common misconceptions?

MYTH: Teacher value-added information is unreliable and shouldn't be a part of teacher evaluation.
REALITY: Many researchers have shown that teacher value-added is the best predictor we have of the future learning growth of a teacher's students. Two new research studies, Chetty/Friedman/Rockoff and the Measures of Effective Teaching study, add new evidence in support of this argument.

MYTH: By putting test scores into teacher evaluation, everyone will do even more to "teach to the test," and if that doesn't work, they'll cheat.
REALITY: No one has yet been able to research the predictiveness and reliability of teacher value-added measures when they are used in high-stakes environments, since such evaluation systems are just beginning across the country. Some teachers may try to game the system. Others may strive to develop the skills research says align with higher value-added results. However, the power of these measures argues for including them as part of a multiple-measures system.

Thank You.