
1 Washington State Teacher and Principal Evaluation Project
Preparing Educators for Rater Agreement and Feedback: Planning for Professional Learning

2 TPEP Sessions for 2014-15
• Face-to-Face Sessions
• Student Growth 2.0
• Rater Agreement Practices
• TPEP/Washington State Learning Standards Connections
• A Virtual Presentation
• Sharing Electronic Resources

3 Group Norms
• Pausing
• Paraphrasing
• Posing Questions
• Putting Ideas on the Table
• Providing Data
• Paying Attention to Self and Others
• Presuming Positive Intentions

4 TPEP Core Principles
Professional learning is the underpinning of the evaluation system.

5 Learning Targets
• I can describe the OSPI working definition of rater agreement and the stages for development.
• I can identify best practices and strategies to maximize rater agreement.
• I can explain what makes a high-quality professional learning process for evaluators and helps administrators develop strong skills in providing feedback.
• I can describe our district’s plan to maximize rater agreement.

6 Activity 1: District Self-Assessment
• Discuss: Using the survey tool, briefly discuss with your district team to determine the current level of practice in maximizing rater agreement.
• Share: What is your district’s identified area of focus?

7 Learning
I can describe the OSPI working definition of rater agreement and the stages for development.

8 Rater Agreement
• Turn and talk to a partner about the following topics:
  • I think rater agreement is important because…
  • My role in the rater agreement process is…

9 OSPI Definition of Rater Agreement
• The extent to which the scores between the raters have consistency and accuracy against predetermined standards. The predetermined standards are the instructional and leadership frameworks and rubrics that define the basis for summative criterion-level scores.

10 Unwrapping the Definition
• Consistency: A measure of observer data quality indicating the extent to which an observer is assigning scores that agree with scores assigned to the same observation of practice by another typical observer.
• Accuracy: A measure of observer data quality indicating the extent to which an observer is assigning scores that agree with scores assigned to the same observation by an expert rater; the extent to which a rater’s scores agree with the true or “correct” score for the performance.
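The slides define consistency and accuracy but do not prescribe a statistic for measuring them. As a rough illustration only (an editor's sketch, not part of the original presentation), the Python snippet below computes exact percent agreement and chance-corrected agreement (Cohen's kappa) for two raters scoring the same observations on an assumed four-level rubric; the rater labels and scores are invented.

```python
from collections import Counter

def exact_agreement(scores_a, scores_b):
    """Share of observations where two raters assign the identical rubric level."""
    matches = sum(a == b for a, b in zip(scores_a, scores_b))
    return matches / len(scores_a)

def cohens_kappa(scores_a, scores_b, levels=(1, 2, 3, 4)):
    """Agreement corrected for chance: 1.0 = perfect, 0.0 = no better than chance."""
    n = len(scores_a)
    p_observed = exact_agreement(scores_a, scores_b)
    count_a, count_b = Counter(scores_a), Counter(scores_b)
    p_chance = sum((count_a[level] / n) * (count_b[level] / n) for level in levels)
    return (p_observed - p_chance) / (1 - p_chance)

# Invented scores on a 4-level rubric (1 = Unsatisfactory ... 4 = Distinguished).
principal = [3, 2, 3, 4, 2, 3, 3, 2]   # the evaluator being checked
peer      = [3, 2, 2, 4, 2, 3, 4, 2]   # another typical observer (consistency)
expert    = [3, 2, 3, 4, 3, 3, 3, 2]   # master scores (accuracy)

print(f"Consistency, exact agreement: {exact_agreement(principal, peer):.2f}")
print(f"Accuracy, exact agreement:    {exact_agreement(principal, expert):.2f}")
print(f"Consistency, Cohen's kappa:   {cohens_kappa(principal, peer):.2f}")
```

Mirroring the definitions above, consistency is read against another typical observer and accuracy against an expert's master scores; a district could substitute whatever agreement statistic its framework provider recommends.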

11 Take “Five”
• Discuss with your team:
  • What systems are in place for your district to focus on accuracy?
  • Are these systems dependent on each other, or could a district start with agreement and then work towards accuracy?
• Share: What systems have you identified?

12 Learning
Content 1: OSPI Stages of Rater Agreement

13 OSPI Stages of Rater Agreement
• Read “Rater Agreement in Washington State’s Evaluation System”
• Discuss with your team:
  • How has your district communicated the vision of Rater Agreement?
  • What structures are in place for staff to learn about the Instructional Framework?
  • How do teachers and building principals use the language of the frameworks to improve practice?
• Share: One strategy that you have identified.

14 Continuous Improvement Process
• Once we reach Stage 3, we do not “graduate.”
• Rater agreement is NOT ensured by a single training or certification test.
• We must revisit and review key learnings from Stage 1 and Stage 2 to avoid drift.
• Provide ongoing professional development, including ongoing calibration conversations involving real-life or video-based observation.

15 Rater drift will naturally occur unless evaluators have:
• Ongoing opportunities to demonstrate agreement
• Access to practice videos for difficult-to-score domains/components
• Expectations that their ratings will be monitored
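The deck does not say how districts should monitor ratings between trainings. As a minimal sketch under assumptions the slides do not state (master-scored calibration videos, an illustrative 75 percent target, invented scores), the snippet below tracks one evaluator's exact agreement with master scores across calibration sessions and flags sessions where agreement slips, which is one way a district might surface drift early.

```python
from statistics import mean

def session_agreement(evaluator_scores, master_scores):
    """Exact-agreement rate for one calibration session against master scores."""
    matches = sum(e == m for e, m in zip(evaluator_scores, master_scores))
    return matches / len(master_scores)

def flag_drift(session_rates, threshold=0.75):
    """Return the 1-based session numbers whose agreement falls below the target."""
    return [i for i, rate in enumerate(session_rates, start=1) if rate < threshold]

# One entry per calibration session: (evaluator scores, master scores) for the
# same video. All values below are invented for illustration.
sessions = [
    ([3, 2, 3, 4], [3, 2, 3, 4]),   # fall: full agreement
    ([3, 2, 2, 4], [3, 2, 3, 4]),   # winter: one discrepancy
    ([2, 2, 2, 3], [3, 2, 3, 4]),   # spring: agreement slipping
]
rates = [session_agreement(e, m) for e, m in sessions]
print("Per-session agreement:", [f"{r:.2f}" for r in rates])
print("Mean agreement:", f"{mean(rates):.2f}")
print("Sessions flagged for recalibration:", flag_drift(rates))
```

A report like this, run after each recurring activity, pairs naturally with the practice-video access and monitoring expectations listed above.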

16 How is Rater Agreement Assessed?
Rating through observations requires evaluators to do a lot of things well. They must:
• Be objective: Record evidence that is free of “bias, opinion, and subjectivity” (Bell et al., 2013).
• Understand the rubric: Thoroughly and deeply understand each component and indicator on the district rubric instrument (Bell et al., 2013).
• Interpret evidence: Correctly align evidence to framework criteria that reflect the context of the evidence (Bell et al., 2013).
• Document evidence: Gather, sort, and record a preponderance of evidence as it happens in the classroom or school (McClellan, Atkinson, & Danielson, 2012).
• Be reliable: Do all of these things well over time, in different classrooms and schools, and for different educators.

17 Professional Learning Takeaways
Ensure that observers have opportunities to learn to:
• Use the rubric language to explain their decisions.
• Consistently take notes that gather useful evidence.
• Avoid making scoring decisions during note-taking.
• Refer back to the scoring criteria when uncertain.
• Practice using videos and live classroom observations.
(Bell et al., 2013)

18 THINK – PAIR – SHARE
• What did you learn that confirmed your understanding of Rater Agreement in Washington State’s Evaluation System?
• What did you learn that challenged your understanding of Rater Agreement in Washington State’s Evaluation System?

19 Learning
Understand best practices and strategies to maximize rater agreement.

20 Why Rater Agreement?
• An educator’s observation scores should be the same regardless of the observer.
• Educators can trust the new evaluation system.
• The new system is legally defensible for personnel decisions.
• Educators receive relevant, useful feedback for professional growth.

21 Two Research-based Rater Agreement Activities
1. Address issues of rater bias, interpretation, and objectivity (Jilliam, Kosa, Tierney, & Tocci, 2014; MET, 2012).
2. Provide rater agreement activities at least three times a year (Jilliam, Kosa, Tierney, & Tocci, 2014); monthly is ideal (Jerald, 2011).

22 What Affects Agreement in Observation?
Observer bias: What are some of the various “lenses” that might bias:
• A teacher evaluator?
• A principal evaluator?

23 Turnkey Activity: Common Sources of Bias
Handout 3: Common Sources of Bias Match-Up
• Step 1. At your table, work as a group to match each common rater error with a possible strategy evaluators can use to avoid the error.
• Step 2. After each match, discuss other possible strategies you have seen used or that you think might be effective.

24 Debrief
Answer Key: 1 = D, 2 = G, 3 = A, 4 = F, 5 = B, 6 = E, 7 = H, 8 = C

25 Two Research-based Rater Agreement Activities
1. Address issues of rater bias, interpretation, and objectivity (Jilliam, Kosa, Tierney, & Tocci, 2014; MET, 2012).
2. Provide rater agreement activities at least three times a year (Jilliam, Kosa, Tierney, & Tocci, 2014); monthly is ideal (Jerald, 2011).

26 Rater Agreement Activities
• Rater agreement activities can be as long as several days of training or as short as 60-minute sessions. Frequency is the key: short, frequent activities can be just as effective as lengthier activities done only a few times per year.
• 60-minute rater agreement activities can be embedded in activities that already exist:
  • Monthly PLCs
  • Walk-throughs
  • Instructional rounds
  • Professional development

27 Selecting Observation Focus for Data Collection
• Scaffolding observations:
  • Group 1: Management
  • Group 2: Environment
  • Group 3: Engagement
• Count off at your table by 3s; be ready to gather evidence on the video based on your number.

28 Video 1: Collect Data
• Ms. Warburton’s 8th grade math lesson: Sorting and classifying equations
• CCSS Math 8.EE.C.7a
• Collect evidence statements
https://www.teachingchannel.org/videos/sorting-classifying-equations-overview

29 Video 1: Share Data

30 Data: Align with Framework
• Let’s all look at evidence that aligns with: [table: Groups 1–3 with columns for the Danielson, Marzano, and CEL frameworks]
• At tables: Based on the evidence we have compiled, talk about the level of performance that should be assigned.

31 Rater Agreement Activities
• All opportunities should be grounded in common protocols or structures that assist in developing and extending the common language of the observation tools and other measures of educator practice. (Jilliam, Kosa, Tierney, & Tocci, 2014)

32 Research-based Rater Agreement Activities
• Discuss with your team:
  • How has your district included ongoing discussions to address bias in the evaluation?
  • What job-embedded strategies are in place for evaluators to practice specific skills?
• Share: One strategy that you have identified.

33 Learning
A high-quality professional learning process for evaluators in providing feedback

34 “The post-conference cannot be treated as a bureaucratic formality; it is one of the most critical features of an effective teacher evaluation system if the goal is not just to measure the quality of teaching, but also to improve it.”
~Jerald and Van Hook, 2011, p. 23

35 Avoid Dominating the Conversation
• A study of evaluation implementation in Chicago found that principals generally dominated the conversation by speaking 75 percent of the time in postobservation conferences (Sartain, Stoelinga, & Brown, 2011).
• Encourage a balanced conversation (50/50) by asking reflective and follow-up questions. Ensure that teachers are prepared to participate, and establish this as an expectation through educator orientation for the new evaluation system.

36 Focus on Evidence
Evidence-based feedback reduces three big dangers in postobservation conferences:
• Loose interpretation. Evidence-based feedback separates observations from interpretations.
• Subjectivity. Drawing upon evidence during feedback conversations can decrease subjectivity (Sartain et al., 2011).
• Emotion. Evidence-based feedback can also “remove some of the emotion from the evaluation process” (Sartain et al., 2011, p. 23).

37 Use Rubric Language and Descriptors
Incorporating rubric language when discussing evidence helps:
• Build and reinforce a shared understanding of good instruction
• Ensure the rubric remains the objective point of reference in the conversation

38 High-Level Questioning
In Chicago, researchers found that only 10 percent of questions asked by evaluators during postobservation conferences were high level and promoted discussions about instruction. (Sartain et al., 2011)

39 High-Level Questioning Rubric
• Low: The evaluator’s question requires limited teacher response (often a single word) rather than discussion and is generally focused on simple affirmation of principal perception. Example: “I think this was basic because of the evidence I collected. Do you agree?”
• Medium: The evaluator’s question requires a short teacher response and is generally focused on completion of tasks and requirements. Example: “Which goals did you not meet?”
• High: The evaluator’s question requires extensive teacher response, reflects high expectations and requires deep reflection about instructional practice, and often prompts the teacher and evaluator to push each other’s interpretations. Example: “How did student engagement change in your class after you used that strategy? Why do you think that happened?”
(Modified from Sartain et al., 2011, p. 24)

40 Ends With Actions and Supports
Professional Growth Planning:
• Connect feedback to the professional growth plan.
• Identify goals, timelines, and benchmarks for areas for growth.
Action Strategies, Practice, and Modeling:
• Ensure the conversation culminates in small, specific changes a teacher can implement in the classroom immediately.
• Have the teacher practice or model the practice.
• Suggest observing a colleague who is strong in the area.
• Direct the teacher to additional resources (online, print, or other colleagues).
(Hill & Grossman, 2013)

41 Activity: A Good Conference or Not?
http://vimeo.com/89454466

42 Helpful Resources for Coaching and Feedback
• Learning-Focused Supervision: Developing Professional Expertise in Standards-Driven Systems (Lipton & Wellman, 2013)
• Principal Evaluator’s Toolkit for the Instructional Feedback Observation (American Institutes for Research, 2012)
• Leverage Leadership: A Practical Guide to Building Exceptional Schools (Bambrick-Santoyo, 2012)
• The Art of Coaching: Effective Strategies for School Transformation (Aguilar, 2013)

43 Research-based Coaching & Feedback Activities
• Discuss with your team:
  • What strategies have you found successful in building a culture that embraces feedback and reflective conversations about teaching and learning?
  • What job-embedded strategies are in place for evaluators to practice specific skills?
• Share: One strategy that you have identified.

44 Resources

45 Framework-Specific Tools

46 “Video” Protocol Example
• North Mason School District
• Viewed a video as a group and collected data/evidence
• Sorted and placed data/evidence on the framework
• Determined the level of performance:
  • First individually
  • Then with a partner
  • Then as a whole group of administrators (5–6)
North Mason used this process for two years; in year three they moved to classroom observations. They use Teachscape videos.

47 “Instructional Reviews” Protocol Example
• Central Kitsap School District
• Created an observational tool to focus the observation on the framework: Purpose, Student Engagement, and Academic Rigor
• Teams entered one school and divided themselves
• Observed in 15-minute time blocks
• Met and talked about their ratings
• Combined ratings into one common rating sheet
• Checked for range and variations

48 eVAL
• A collection of 93 videos for:
  • Exploring your instructional framework in the context of classroom instruction
  • Discussing the characteristics of quality instruction as described by your framework
  • Engaging groups in interpretation, analysis, and discussion of classroom practice with the use of tools and rubrics

49 Swivel Video Camera
• A tool to create video for internal use

50 Additional Resources
• eVAL training video (Debbie Tschirgi, ESD 112): http://vimeo.com/101444062
• Statewide Ed Tech website with the logins and URL for the eVAL Sandbox: http://www.edtech.wednet.edu/eVALTraining/

51 Implementing
Develop long-range professional development based on the district’s current level of practice.

52 Plan for Rater Agreement
• How will new evaluators receive training in the framework prior to evaluating staff?
• How will new evaluators learn to apply the framework in a formative process during their first year?
• How can existing meetings or professional learning communities be used to provide opportunities to maximize rater agreement?
• What opportunities will be provided to trained evaluators to continuously hone their skills?

53 Rater Agreement: District Planning Tool
• Initiating
• Refining
• Sustaining

54 District Planning Tool
Each stage of Rater Agreement includes:
• Assessment
• Planning
• Implementing
• Reflection on Measuring

55 Learning Activity: Monitoring and Maintaining Rater Agreement
• Read “High Fidelity: Investing in Evaluation Training” (15 minutes)
• How are states providing training for new teacher and principal evaluation systems?
• How is your district providing support to monitor and maintain rater agreement?
• Choose one strategy to add to your District Planning Tool.

56 Reflecting

57 Revisiting the Learning Targets
• With a partner:
  • Describe the OSPI working definition of rater agreement and the stages for development.
  • Identify a best practice or strategy that could be used to maximize rater agreement.
  • Explain a strategy that would help administrators develop skills in providing feedback.
  • Discuss how your district plan includes the elements listed above.

58 What’s Next
• Homework options:
  • I. Pilot with Administrative Team
  • II. Develop a Guidebook
  • III. Develop a Training Protocol

59 Thank you!
Presenter Name
XXX-XXX-XXXX
xxxxxxxxxxx@xxx.xxx
1234 Street Address
City, State 12345-1234
800-123-1234

