
Aligning Academic Review and Performance Evaluation (AARPE Part 2)


1 Aligning Academic Review and Performance Evaluation (AARPE Part 2)
Virginia Department of Education, Office of School Improvement

Welcome to session 4 of AARPE Part 2 – Technical Assistance focused on Aligning Academic Review and Performance Evaluation. As always, we have a very full agenda planned for today.

We will begin by revisiting your work on goal-setting practices to increase student achievement by reviewing goals and providing feedback. Afterwards, we will return to Teacher Performance Evaluation Standard 4 and the related Principal Performance indicators by examining the look-fors that you developed for each sample performance indicator during session 3. You will refine those look-fors, making necessary additions, deletions, or modifications to more fully identify evidence of assessment of and for learning that might be seen or heard during a classroom observation.

We will move next to the Inter-Rater Reliability observations that were conducted since our last session by teams consisting of principals, division leaders, and OPs. We will use a specific protocol to guide debriefing of that work, and we will note both the evidence collected and the feedback that resulted from those classroom visits. Additionally, you will have an opportunity to schedule the next round of IRR work.

Today’s session will include a review of formative assessment processes and products that were introduced during Session 2. We will continue to examine samples of formative assessments (including objectives) to deepen our understanding of how assessment for learning and instruction are inextricably intertwined. This work is intended to assist you in providing feedback and guidance to teachers and leaders in your division. When teachers use the data collected during formative assessments to guide and adjust learning, student achievement is enhanced.

As we go through the presentation, remember that our emphasis is not just on what the teacher is doing, but on what students are doing: their learning experiences and the outcomes of those learning experiences. The lesson observation reflects not only the teacher’s plan but the actual implementation of the lesson and the outcomes for students. We will continue to look for evidence that the teacher uses formative assessment during the lesson to identify students who are meeting learning goals and to intervene and close learning gaps for those students who are struggling. Resulting evidence-based feedback is a critical component of lesson observation and leadership.

2 Technical Assistance 2015-16 (slide 1 of 5)
Overview and Purpose

To improve instruction and instructional leadership practices by strengthening the alignment between the Performance Standards for Teachers and Principals and the Lesson Planning, Lesson Observation, Professional Development, Leadership, and Assessment Academic Review Tools.

To develop sample evidence (look-fors) for the sample performance indicators in Teacher Performance Standard 4, Assessment of and for Learning. The sample evidence for each performance indicator will become a tool that can enhance your division’s observation tools. (Division and School Leadership)

We will review the two bullets. As a reminder, you have your own forms for walkthroughs, observations, and other feedback, and this work is not about changing the forms that you use.

Bullet 1: Note the addition of the Assessment Academic Review Tool for this year as we focus on Standard 4, Assessment of and for Learning.

Bullet 2: Just as we did for Teacher Performance Standards 1, 3, and 5 last year, we will develop evidence-based look-fors for Teacher Performance Standard 4 this year. We will look at the connections between academic review tools and performance indicators and use them as a reference going forward.

3 Technical Assistance 2015-16 (slide 2 of 5)
Overview and Purpose (continued)

To develop sample evidence (look-fors) for the sample performance indicators in the Principal Performance Standards for those standards that link to Teacher Standard 4. The sample evidence for each performance indicator will become a tool that can enhance your division’s observation tools. (Division Leadership)

To develop a common vocabulary and shared understanding of formative assessment.

To enhance SOL data analysis skills to set meaningful goals at the school, grade/department, teacher, and principal levels.

To enhance feedback skills of building and division leaders.

Please review the four bullets.

Bullet 1: Today we will examine indicators cited in Principal Performance Standard 1 that relate to Teacher Performance Evaluation Standard 4 and identify look-fors for principal’s practice.

Bullet 2: Performance Standard 4 encompasses a multitude of types of assessments. We will focus our work this year on formative assessment.

Bullets 3 and 4: We will continue to practice goal setting and feedback skills in this setting and in the school setting. We encourage you to continue to ask essential questions such as: What data were used to set this goal? How are these goals differentiated by teacher and teacher groups? When and how will progress be monitored? When and how will feedback be provided?

4 Technical Assistance 2015-16 (slide 3 of 5)
Outcomes Each participant will: Set clear and measurable goals for his/her work. Implement action steps with fidelity. Monitor and provide feedback. Adjust based upon monitoring and data. Repeat the cycle. We encourage each division to define its own outcomes, and we encourage you to follow this cycle. Reflection and evaluation are a big part of the outcomes, but we are not evaluating you. Your personal reflection and self-evaluation are important to the success of this technical assistance series.

5 Technical Assistance 2015-16 (slide 4 of 5)
Roles and Responsibilities

Division staff will lead this professional development for principals of all second-year accredited with warning schools.
All OSI training materials will be available to district staff.
Principals/division staff will use their own work as a starting point and will bring a variety of specific predetermined documents and work samples to each session throughout the year.
Pending the availability of funds, principals, appropriate division staff, and the state contractor will conduct inter-rater reliability walkthroughs and formal observations following sessions 3 and 4.
Division staff will support and monitor principals’ delivery of professional development to appropriate school staff on the sets of sample evidence developed.

Bullet 1 – Materials will be emailed to one point of contact in each division to share with designated trainers. This is a part of the support for school principals who will be trained by division leaders here today.

Bullet 2 – We will inform everyone at each session regarding the types of documents and/or work samples that will be needed at the following session. Once determined, we will consider this a training non-negotiable for building administrators and other division leaders.

Bullet 3 – Following sessions 3 and 4, and pending funding availability, one day will be identified for the principal, designated division leader, and the state contractor to conduct walkthroughs and observations for inter-rater reliability. A calendar will be developed for the collaborative classroom observations/walkthroughs.

Bullet 4 – Division staff will support (through scheduling and coaching). A calendar will be developed for this, and division staff will attend.

The key to implementing this training is that we are all speaking the same language and using the same tools where appropriate. All school improvement work should be tied to training and performance standards. The division will be responsible for working with the LTP for the school(s) to ensure that their work complements and enhances this AARPE technical assistance. Any questions?

6 Technical Assistance 2015-16 (slide 5 of 5)
Professional Development Norms Arrive on time and stay the entire session. Be an active participant. (Complete in-session and between-session work.) Reserve school/division/personal business for established breaks. Maintain confidentiality. Use large sticky notes to indicate additional professional development needed/desired.

Bullet 5 – Please note the location of large sticky notes posted in the room and small notes on each table. As we do this work, it will be important for you as leaders to identify what, if any, additional types of support/professional development are needed in order for participants to translate this into practice. Additional training might be for teachers in the building, principals, division leaders, OR any combination of these. For example, if you are still working with staff on unpacking standards as defined by the academic review tools, you may decide that additional training is needed. (Training that includes the real work and feedback.) If you are a new principal or division leader, you may need some training to “catch up” on the work done last year. It could be training that needs to be completed after this year’s AARPE training. This list will be ongoing and will provide us with valuable information on learning “gaps” or “misconceptions”. Addressing the needs will vary depending upon what they are and what school, division, and state resources are available. We want to encourage each of you to be candid and honest in your responses. Doing so will help us all improve our skills and knowledge and build our capacity to help students achieve. Please use the sticky notes at your tables to let us know any additional needs. “In the moment” questions and answers may be recorded in order to assist all AARPE participants in the state. We will do our best; however, this should not stop anyone from asking questions. Not every “in the moment” question will be recorded.

7 Essential Questions

How do the components and criteria of the selected academic review tools align with the Teacher and Principal Performance Evaluation Standards lesson observation tool?
How do we use these tools for school improvement?
How do we know we are making progress?
What are the next steps?

These essential questions grew out of academic reviews in Virginia focused on the alignment of the written, taught, and tested curriculum, along with tools and supporting resources developed by Stronge and Associates. Last year, we focused on the Lesson Observations, Leadership, and Professional Development Academic Review Tools. This year we will add the Assessment Tool, as well as the teacher and principal performance standards related to assessment of and for learning. Our work will revolve around these four essential questions. These steps are crucial to school improvement and must be given top priority in technical assistance for the schools under state and federal sanctions this year.

Question 1: We examined the tools in AARPE and will quickly review them.

Question 2: Big pieces of this include providing targeted leadership, professional development, and feedback. An interesting fact to note about priority schools across the state: on the annual Leading Indicator form that goes to USED, question 8 asks for the distribution of teachers by performance level on the LEA’s teacher evaluation system. Where do you think the vast majority of teachers in priority schools are rated?

Question 3: After reading the question, allow a chance for reflection and discussion.

Question 4: As those of you who were with us last year know, we leave every session with identified Next Steps to accomplish before the subsequent session. Beginning with session 2, we start every session with a report out from each division on the progress that was made in accomplishing those next steps. We recommend this practice to you as you meet with your leadership teams and conclude professional development sessions.

8 Next Steps (1) (Completed by This Session)
Division leaders will meet with AARPE principals to develop plans for the principals’ delivery of professional development to appropriate school staff on (a) evidence-based look-fors that support Standard 4 and (b) the relationship between instructional objectives and formative assessment. Plans need to include date, time, and draft agenda items, and be incorporated under the appropriate indicator and task in the SIP. This and the following two slides show the Next Steps from October’s session.

9 Next Steps (2-3) (Completed by This Session)
Division staff will develop plans for AARPE professional development for all 2nd-year warned schools. Plans need to include date, time, and draft agenda items, and be incorporated under the appropriate indicator and task in each school’s SIP.
Completed the agreed-upon number of inter-rater reliability (IRR) walkthroughs and observations (Division leaders, principals, OPs).

10 Next Steps (4) (Completed by This Session)
Samples of:
Lesson objectives (behavior, conditions, criteria)
Collected formative assessments
Observations
Feedback on Teacher Standard 4
Goals:
Principals: one teacher goal
Division Leaders: one principal goal

11 The Goal-Setting Process: Setting Goals and Monitoring Progress
Let’s return to how you have used summative data from assessments to inform goal-setting for your school, your teachers, your division, and your principals, and how you have monitored or might monitor progress for these goals.

12 Why Goal Setting? Increase student achievement.
Make explicit connections between teaching and learning. Make instructional decisions based upon student data. Provide a tool for school improvement. Research tells us that making instructional decisions based upon student data is the first step in school improvement. Whether you are goal-setting as a school division, school, department, grade level, principal, or teacher, it is necessary to focus attention on student results (data) in order to increase student achievement. We begin with data and continue by making explicit connections between teaching and learning. To change the learning outcome (student achievement), we must change instructional practices (input).

13 Using Data to Differentiate Goal Setting
How do I:
Use data to set meaningful goals at the school, grade/department, teacher, and principal levels?
Make a goal-setting session an evidence-based discussion?
Monitor what happens in classrooms/schools to determine that the teacher/principal is moving toward goals/targets?

We have noted that goals will likely vary by school, by grade level, by content area/instructional department, and by teacher and principal. This parallels the fact that goals for our students vary by individual and group, requiring evidence-based differentiation as well. As you worked with your teachers (or principals) to create goals for the school year, we trust that you asked and answered these questions for yourselves.

1. You have examined on-demand and published reports as valuable resources for identifying baseline data that could be useful when setting goals and targets.

2. We have recognized that the collection of evidence to inform our decisions and actions is critical to school improvement and increasing student achievement. Goal-setting conferences must include a review of the evidence (data) and must identify the evidence that will be collected to confirm whether or not progress is being made towards reaching the goal.

3. Most of us will agree that formative assessment is key to monitoring student progress. During the course of this training, we have discussed different types of assessments and the purposes and essential criteria for formative and summative assessments.

We recognize that goal setting is a process. It is not an insert into a school improvement plan nor is it a teacher performance evaluation document written in September and filed away until June. Nor is it a single exchange between a teacher or principal and his or her supervisor. It is a first step in the cycle of improvement that must be continually monitored for progress and adjusted as warranted.

14 Goal-Setting Process Analyze existing data to determine needs.
Meet with principal or teacher to create clear and measurable goals based on data. Implement, monitor, and provide feedback on teaching and learning strategies. Monitor student academic progress through ongoing formative assessment and adjust as warranted by data. Assess goal attainment. Did your goal setting process include these steps? Each step is critically important. You are encouraged to reflect upon these steps and ask yourself, “Did our goal setting process include these steps?”

15 Criteria to Consider When Establishing Goals
Data/evidence based?
Differentiated by school, grade level, and/or individual teacher or principal performance?
Rigorous and realistic?
Specifies who does what by when?

First bullet: Goals should provide baseline data (where we are now) that show the prior performance of the group or individual for which the goal is written. Baseline data points are used to determine needs and provide a starting point for improvement.

Second bullet: Just as we differentiate our instructional “treatments” with students, we must also differentiate goals and targets for the educators we supervise. We understand that two different teachers (or principals) will have different areas of strength and challenge and, consequently, should have different goals for improvement. A “one-size-fits-all goal” approach may have a detrimental effect, demotivating some and lowering expectations for others. We want to create goals that influence and support everyone to grow.

Third bullet: The goal must set high expectations and at the same time be realistic, i.e., be within the principal’s and/or teacher’s control to effect change.

Fourth bullet: The goal must identify the means for attaining the goal, such as the activities or strategies that will be used to accomplish the goal, the individual or individuals responsible for implementing the activities or strategies, and the deadlines for completion. If a goal does not specify these pieces, it is not measurable.

Division-level leaders, from the goals that you develop, could your principals determine exactly what needs to be done, how they will do what needs to be done, who will do what by when, and how progress toward goal attainment will be monitored? Principals, from the goals that you develop with your teachers, could they determine what needs to be done and who does what by when? Would your teachers know how progress will be monitored?

16 A Look at Goals Reread the goals that you brought with you today.
Examine them against the criteria on the previous slide. What changes would you make to the goal(s)? Work with a table partner to exchange and provide feedback on each other’s goals. Rewrite one or two of your goals based on your review and your partner’s feedback.

We asked all of you to bring one goal that you have established for this school year, and we thank you for completing that task. Look at the criteria from our previous slide. Consider the goal you selected in comparison to those criteria and answer these questions for yourselves: Was the goal based on data? What data provided the basis for this goal? Can you say that the goal is differentiated to the school, content/grade level, teacher or principal level? Is the goal rigorous? Is the goal realistic? Does the goal specify who does what by when? Think about how you might improve upon your goal.

Now exchange with a table partner. Table partners should provide feedback and suggestions for improvement to each other. We will now return to individual work. Use your own assessments and the feedback provided by your table partner to rewrite the goal. Make any notes, adjustments, and plans for actions that you will take as you move forward with monitoring goals for the remainder of the school year.

17 Corners Activity

Role of Principal: Principal chooses one goal with which to model a goal-based conference. Principal shares his/her data, guidance, and feedback as if speaking to a teacher or group of teachers, including asking questions related to the goal. When finished, the principal listens as each group member offers feedback.

Role of Group: Group members listen attentively and take notes as needed as the principal shares the goal, guidance, and feedback. The VDOE OP will act as the teacher(s) and will respond to the principal’s questions. When the principal finishes, each group member (with the OP speaking first to demonstrate) will offer verbal feedback. The round is completed when all group members have given feedback.

We will group division leaders together and principals together for this activity. It is important for groups to be the same size, with a maximum of 10 to a group. This may result in separating staff from a particular division, and that is ok. The time committed to this activity will be determined/adjusted by the facilitators based on the day’s schedule. It is critical that every effort is made to ensure that every principal or division leader has the opportunity to role play a goal-related conference using the goal that s/he brought to the session. Time permitting, at the end of the role plays and prior to the debriefing, facilitators might ask participants to write personal reflections on a notecard.

The following is a review of the activity’s directions, if needed: Many of you will remember the Corners Activity that we did in Session 5 of AARPE Part 1. With the Corners Activity, we are essentially creating a role play but with the principals’ goal-setting conference as the script or context. The principal shares data and feedback and asks questions in front of the group. The contractor (VDOE OP) acts as the teacher in that he/she responds to questions. The OP’s responses should be those of a mid-range teacher, one who is good at some things but needs feedback on others. It is important for the principal to share all of the feedback he/she provided/will provide for the teacher or group of teachers. Once the “session” is complete, the OP will begin the feedback round for the principal. Each group member will then offer his/her feedback. The principal will not offer any comments but will, instead, simply listen. The round is complete when all of the group members have responded. The second round begins with the next principal beginning at step one by sharing his/her goal-setting conference data and feedback.

18 Debrief Corners Activity
Give specific examples of evidence-based feedback heard. List any strengths noticed during this activity. List any areas of improvement. Based on the feedback received, how do you plan to strengthen these areas? Develop personal Next Steps based on what was learned today. You have had an opportunity to model a goal-setting or progress-monitoring conference, receive feedback, and give feedback to other participants. Now, we ask that you reflect and debrief from this activity. Please complete steps 1-3 of this activity on a full sheet of paper for the facilitator to collect and review, and complete step four on a separate sheet of paper for you to keep.

19 Using a Standards-Based Observation Form
Now it is time to return to gathering evidence during observations that are grounded in the Teacher (and Principal) Performance Evaluation Standards as they relate to Assessment of and for Learning.

20 Main Areas for Evidence Collection During Observations
Standard 1: Professional Knowledge
Standard 2: Instructional Planning
Standard 3: Instructional Delivery
Standard 4: Assessment of/for Learning (summative/formative assessment)
Standard 5: Learning Environment
Standard 6: Professionalism
Standard 7: Student Academic Progress

Here we see the familiar list of seven standards included in Virginia’s Guidelines for Uniform Performance Standards and Evaluation Criteria for Teachers. Teacher Standard 4 (Assessment of/for Learning) is our focus for technical assistance. In our previous session, we examined the seven sample performance indicators for Teacher Standard 4 and three selected indicators for Principal Standard 1.

21 Standard 4: Assessment of and for Learning
Sample Performance Indicators:
4.1 Uses pre-assessment data to develop expectations for students, to differentiate instruction, and to document learning.
4.2 Involves students in setting learning goals and monitoring their own progress.
4.3 Uses a variety of assessment strategies and instruments that are valid and appropriate for the content and for the student population.
4.4 Aligns student assessment with established curriculum standards and benchmarks.
4.5 Uses assessment tools for both formative and summative purposes and uses grading practices that report final mastery in relationship to content goals and objectives.
4.6 Uses assessment tools for both formative and summative purposes to inform, guide, and adjust students’ learning.
4.7 Gives constructive and frequent feedback to students on their learning.

This slide lists the indicators for Teacher Performance Standard 4: Assessment of and for Student Learning. The overarching standard states that “(t)he teacher systematically gathers, analyzes, and uses all relevant data to measure student academic progress, guide instructional content and delivery methods, and provide timely feedback to both students and parents throughout the school year.” For this standard, we are looking for evidence: what we see and hear during the classroom visit that confirms the teacher performs the actions specified in the indicators. In our last session, you began to identify look-fors by working together to create a list of what you would expect to see or hear for each indicator during a classroom observation. Look-fors provide examples of evidence that might confirm whether or not the teacher performs the actions specified in Standard 4 and its indicators. Those look-fors were a first draft. We hope you have found time since our last session to give some thought to the look-fors as you participated in observations and walkthroughs that are an essential component of your work. We now have an opportunity to re-examine those look-fors, vet them more thoroughly, and discern which are accurate and useful examples of either teacher or student behavior that may be seen or overheard during a classroom visit. Our next activity allows you to take a deeper dive into Teacher Standard 4 and to sharpen your list of look-fors.

TQR Teacher Quality Resources, LLC (c) 2005

22 Standard 1: Instructional Leadership 1.1-1.6 (Principal)
Sample Performance Indicators:
1.1 Leads the collaborative development and sustainment of a compelling shared vision for educational improvement and works collaboratively with staff, students, parents, and other stakeholders to develop a mission and programs consistent with the division’s strategic plan.
1.2 Collaboratively plans, implements, supports, monitors, and evaluates instructional programs that enhance teaching and student academic progress, and lead to school improvement.
1.3 Analyzes current academic achievement data and instructional strategies to make appropriate educational decisions to improve classroom instruction, increase student achievement, and improve overall school effectiveness.
1.4 Possesses knowledge of research-based instructional best practices in the classroom.
1.5 Works collaboratively with staff to identify student needs and to design, revise, and monitor instruction to ensure effective delivery of the required curriculum.
1.6 Provides teachers with resources for the successful implementation of effective instructional strategies.

This slide and the one following identify the sample performance indicators for Principal Performance Evaluation Standard 1: Instructional Leadership. Note the indicators that are highlighted in red. These are the indicators that most closely align with Teacher Performance Evaluation Standard 4. In session 3, you identified look-fors for the highlighted indicators that focus attention on principal leadership in assessment and use of data that will impact improvements in instructional practice, student achievement, and overall school effectiveness. In this session, we will ask you to revisit the first draft of look-fors for principal practice, vetting the list of sample evidence and considering revisions, modifications, refinements, and/or additions to improve the original product.

23 Standard 1: Instructional Leadership 1.7-1.12 (Principal)
Sample Performance Indicators:
1.7 Monitors and evaluates the use of diagnostic, formative, and summative assessment to provide timely and accurate feedback to students and parents, and to inform instructional practices.
1.8 Provides collaborative leadership for the design and implementation of effective and efficient schedules that protect and maximize instructional time.
1.9 Provides the focus for continued learning of all members of the school community.
Supports professional development and instructional practices that incorporate the use of achievement data and result in increased student progress.
Participates in professional development alongside teachers when instructional strategies are being taught for future implementation.
Demonstrates the importance of professional development by providing adequate time and resources for teachers and staff to participate in professional learning (i.e., peer observation, mentoring, coaching, study groups, learning teams).
Evaluates the impact professional development has on the staff/school improvement and student academic progress.

See indicators 1.7 and 1.10, highlighted in red.

24 Standard 4 and Related Principal Look-Fors
Steps 1 and 2 completed during Session 3.
Facilitators share master list of look-fors from Session 3 and assign one indicator to each table.
Individually, each participant will find the assigned indicator and prepare thoughts and notes for the discussion.
Each table group discusses, validates, and comes to consensus for each identified look-for.
The table recorder creates one unduplicated list and saves it to the flash drive.
Table groups report out, and the whole group provides feedback.

During Session 3, you used your own documents to begin creating a list of look-fors for Teacher Standard 4, Assessment of and for Learning, and for three indicators from Principal Standard 1. We concluded with table group work to establish a consolidated draft list of look-fors. Those look-fors were combined from all groups to generate a master list of look-fors, and that list was sent via email to you prior to today’s session. Today, we will continue with the activity that was started in Session 3 by completing the remaining steps.

Step three: Your table group will be assigned a specific performance indicator for which each of you is responsible. This step provides time for each individual to review the consolidated list of look-fors for one indicator only, taking time to examine the look-fors critically, based on your own personal experience during observations. Ask yourself whether these look-fors represent what you have seen or might see a teacher or student doing or saying during the lesson. Are there items on the list that are duplicates or near-duplicates? Perhaps some could be combined, revised, or eliminated. Your role is to act as a critical but knowledgeable reviewer of the look-fors as you prepare to share feedback with your table group. Prepare your thoughts and notes for that discussion. Step three is to be completed individually so that each participant has an opportunity to gather his or her thoughts before sharing with the group. We want to be sure that each participant is able to contribute meaningful information based on his or her own valuable experiences.

Step four: We are going to continue work now by reviewing and discussing these look-fors with your table group in order to confirm that the list contains clearly stated look-fors that meet our established criteria for evidence. Recognize that your statements must be concise and specific; this is not the place for generalities. In addition to the criteria we displayed earlier, ask yourselves these questions (for reference, these questions appear on the next slide): Is each statement specific? Is each statement concise and observable? Have we avoided generalities? Do we have look-fors that address both teacher and student behavior? Do we have look-fors that address what the indicator does look like as well as what it does not look like? Is this something I can touch? see? hear? Each group should identify a recorder to prepare the final list of look-fors based on group consensus. A second person should act as the reporter for step five.

Step five: We have received all lists, and now we will begin the process of reporting out to the entire group. As your group’s reporter shares out your list, your group’s recorder should be making the agreed-upon changes in your electronic list. This final list will be saved to the flash drive that will be circulated around the room. Now is the time to ask questions or offer suggestions for the final version of look-fors for these performance indicators.
Our final product can only get better when we all apply ourselves to this task.

25 Look-For Criteria Is each statement specific?
Is each statement concise and observable? Have we avoided generalities? Do we have look-fors that address both teacher and student behavior? Do we have look-fors that address what the indicator does look like as well as what it does not look like? Is this something I can touch? see? hear? We will leave these questions on the screen to assist you as you work with your table to finalize the look-for list for your assigned indicator.

26 Inter-Rater Reliability Protocol
We will now move to our work on the inter-rater reliability protocol. Later in the session, each IRR team will have the opportunity to debrief from its recent observations. First, we will briefly review why reliability in observations is needed and how it can be established.

27 Reliability in Observations
Common reliability issues in observations may include the lack of: Criterion reliability, Intra-evaluator reliability, and Inter-evaluator reliability. Last session we took a brief look at these three issues that negatively impact reliability in observations. With regard to the first bullet, we know that one of the ways to increase reliability during observations is to better understand the criteria against which the teacher or principal is being evaluated. Deeper knowledge of the performance evaluation standards and the look-fors for each indicator helps to increase our own reliability as we observe others in their practice. In session 3, and again today, we invested time creating lists of look-fors based on Teacher Standard 4, with the goal of increasing our understanding of the criteria and enhancing the reliability of our observations. We have seen examples from sample observations and in our own work that remind us of the possibility of collecting idiosyncratic or irrelevant evidence during lesson observations. Time spent learning more about performance evaluation standards and the indicators that appear on observation documents is not wasted time. One outcome of this intentional work is a reduction in bias; therefore, we are less likely to focus on irrelevant evidence and we are more likely to provide clear, specific feedback in our written statements. We are going to skip over intra-rater reliability for now and focus our time on the third bullet: Inter-evaluator reliability, or the degree to which two or more observers of the same lesson agree upon what was seen and the evidence that was collected.

28 Establishing Inter-Evaluator Reliability
Calibrate evidence-collection during initial training.
Conduct tandem observations and performance reviews with multiple evaluators.

One way to establish inter-evaluator reliability is through reliability trainings that calibrate evaluators’ evidence collections during observations. After the initial training, the inter-evaluator reliability should be periodically re-established through tandem observations in which multiple evaluators observe the same lesson and compare the evidence they collected and how they interpreted it. This is the purpose for the Inter-Rater Reliability observations that you conducted in your schools over the last few weeks, in which teams of educators (principal, division leader, and OSI OP) observed real time classroom instruction together. Each team member looked for evidence of formative assessment occurring in the classroom. It is now time to analyze and compare our findings, looking for consistencies and inconsistencies in the evidence we collected, with the ultimate goal of increasing the consistency between the evidence-collection of two or more evaluators and thereby improving the reliability of our observations.

29 *Inter-Rater Reliability Protocol
Protocol Step 1: Individual: Using your Round 1 evidence documents that you brought today, note the specific look-fors you observed for Standard 4 for each of the three or four teachers. Use a highlighter or some other method to mark the look-fors right on your documents. (Refer to Standard 4 look-for documents.)

Protocol Step 2: Partner with your observation team. Each member shares the look-fors observed (per teacher, per indicator). Discuss any differences (per indicator, per teacher). Each member completes the protocol tool, pages 1-4. How closely did your written documentation align with the look-for documents?

Protocol Step 3: On your protocol handout, pages 5-6, list specific questions that the principal could ask the teacher based upon the evidence you recorded during the observation. Be prepared to read the evidence-based observation statement that led to the question. Share your list with your observation team. Record these questions on the chart paper provided, and post when completed.

Protocol Step 4: Whole group debrief. Each group shares the following for Standard 4: a) differences found in look-fors for Standard 4, b) solutions to avoid these differences in Round 3, c) 2-3 of the feedback questions you’ve agreed upon and how these questions will lead to improved student achievement. When reporting, read the evidence-based statement first, then read the question that you would ask the teacher based upon the evidence you recorded during the observation.

Step one is individual work, and it asks that you note specific look-fors you observed. In step two, you will partner with your observation team to share the observed look-fors and discuss. Step three: Providing feedback following an observation is evidence-based work. Open-ended questions such as “how do you think it went?” may distract or divert attention from the purpose of the conference conversation, which is to provide specific examples of what you saw (or did not see) for the purpose of improving instructional practices. Step four is a debrief and includes three pieces of information to be shared. When reporting part c), read the evidence-based statement first, then read the question that you would ask the teacher.

30 Framework for Inter-Rater Reliability Work
Who: Principal, AARPE Key Division Leader, Assigned OP
When: One day between AARPE Session 4 and AARPE Session 5. The day identified by the principal, AARPE Division Leader, and OP is considered non-negotiable.
What: Three 30-minute formal observations (adjusted for November/December school calendars). Use the school division’s forms.
Debriefing: Formal debriefing conducted at the next scheduled AARPE session.

We are using inter-rater reliability work to improve the reliability of our observations and to ensure that our feedback (verbal and/or narrative) is evidence-based. Remember that we use the term inter-rater rather than inter-evaluator because we are not evaluators of you or your staff. Our primary purposes are to increase principals’ effectiveness, encourage principal feedback to teachers, and provide the opportunity to monitor principals.

The VDOE/OSI makes the following recommendations: Observations should last the entire block for formal observations. In lieu of the entire block, observations should last no less than 30 minutes, with 45 minutes preferred. The role of the OP is to offer additional observer feedback on the performance indicators and to encourage the use of our technical assistance learning. Ultimately, the observation team is focusing on the question: Is it really evidence?

Your observation team includes the principal, a member of your division leadership team, and your assigned OP. You are expected to have a team observation on one day during the identified window of time. These observations are counted toward any requirement your division has established regarding teacher observation. We will debrief your observations at our next AARPE session. We are going to take time now to identify the specific dates outlined in number two and the specific number of observations/walkthroughs, as well as the duration of each.

31 Time Management Plan ahead, establish, and publish (with appropriate staff) times for observing in classrooms. Discuss with appropriate division staff, gaining stakeholder support by making it known that there will be times the administrative staff is unavailable while observing instruction in the classroom. Chunk out time. Example: mornings on Wednesdays and Fridays, afternoons on Tuesdays, etc. Plan for distributed leadership. Who will be your designee for what? What is an acceptable reason to interrupt? What time management strategies are you planning for this year? Please review the information on this slide and recall it is important to protect this time for observations.

32 Formative Assessment In the next steps from last session, we asked that you bring an additional sampling of formative assessments. It may be that after our last session, your understanding of formative assessment changed or strengthened. Gathering formative assessments may have become easier, if you better understood what to look for, or it may have been more difficult, if you were looking to capture that minute-to-minute teacher and student interaction rather than collect a worksheet or a quiz to bring today. Before we begin working with the formative assessments you brought today, we will review the processes and products that may be considered useful formative assessments.

33 Formative Assessment: Processes and Products
Observation
Conversations and Questions (questioning strategies, interviews, conferences)
Self-evaluation (student reflections, rubrics)
Artifacts (folder of work samples, chronological records)

As school leaders and evaluators of instruction, during pre-conferences, observations, and walkthroughs you are looking for and collecting evidence related to the processes and products of formative assessment. You will recall from session 2 that there are several broad categories that inform our thinking about the processes and products of formative assessment. As teachers plan lessons and write objectives, they should decide how and when evidence of student learning may be gathered during the lesson. Aligned lesson objectives (with behavior, conditions, and success criteria clearly communicated to students) are one opportunity that teachers can use to gather data on student progress toward the learning goals. Learning experiences must be thoughtfully planned so that the evidence elicited will give the teacher and students information about what students know and can do. This means that data reveals understandings as well as misconceptions. One focus of our technical assistance work has been (and will continue to be) an emphasis on evidence; teachers should elicit evidence throughout the lesson that guides teaching and informs both teacher and student of the student’s progress toward the learning objective. The most successful formative assessment strategies gather data at the right time with the least disruption to the on-going learning in the classroom. Let’s think a bit more about each of these broad categories and what the data, the evidence, might look like within each category.

34 Observation Close observation is central to formative assessment. Examples of tools for keeping track of observed evidence include: Anecdotal notes (journal, sticky notes, e-tablet, labels) taken during lesson or noted at end of day; Running records or miscue analysis notes; and Checklists, tallies, or charts. Observation helps leaders to answer the question, “What do I see or hear that is evidence of students’ learning?” Likewise, teachers who concentrate on observables are able to note what the student says, does, makes, or writes. These behaviors provide evidence of learning and show where the student is in relation to the pre-established criteria. Keeping track of this information is one way for teachers to capture evidence of progress toward the learning objective. You see here a list of examples of strategies that teachers might use to gather and record data as they observe their students during the lesson. This list is not exhaustive. Keeping track of evidence from observations as well as feedback provided to students will look different from classroom to classroom and from lesson to lesson within the same classroom. The strategy should fit the needs of the lesson, students, and teacher. The most successful formative assessment strategies gather data at the right time with the least disruption to the on-going learning in the classroom. Regardless of the strategy selected for recording data, it is important for the teacher to make marks, comments or notes about the students throughout a class or lesson. At the end of the instructional period, the teacher may then file or save the data that is linked to the individual student. The teacher may then revisit her notes, using the data to make instructional decisions. Which students need intervention in order to close a learning gap or to correct faulty thinking? Error analysis based on incorrect work is a powerful tool for the teacher who knows the common mistakes that students make as they are learning. The teacher may differentiate instruction by reteaching to eliminate errors and misconceptions for some students while having other students move to a more rigorous extension of the original learning goal. Consider whether or not you have seen examples of teachers using observation as a method for assessing student learning. Did the teacher record data as s/he monitored student work? What happened to that data? How did instruction change as a result of teacher observing students? Did the teacher, after seeing or hearing student responses, address areas where student(s) were having problems before moving on? Remember, a strategy is not true formative assessment unless something is done with the data to improve student progress toward the learning goal.

35 Questions and Conversations
Interactions that call for a personal response provide targeted data to inform the teacher about a specific aspect of the learning.
Questioning with public or private responses using hand-signals, individual dry erase boards, clickers, think-pair-share, exit tickets
Surveys (written or oral)
Interviews (one-on-one)
Conferences

When a teacher has students pause to think about their thinking and articulate that thinking process, he or she may gain information about the learning that is taking place. The teacher is now equipped with information to repair any misunderstanding and can do so immediately. If the questions or conversations reveal that there are no misunderstandings, the teacher is confident that students are ready to move on with another learning goal or a more challenging extension of the current goal. Checking for understanding can be as quick and simple as thumbs-up/thumbs-down or as involved as one-on-one interviews. Whichever formative assessment method the teacher chooses should fit the content, age of the students, and the time available. As you may have seen or discussed earlier, a teacher may choose to continue asking questions, prompting, probing and leading a student toward understanding, and allowing the student to close his or her own learning gap. Think of the questions you might hear the teacher asking and how those questions can be used as evidence of formative assessment. Can you say that your teachers use techniques to find out what every student knows? Are you concerned that teachers allow volunteer responses or choral responses more often than they gather data from individual student responses? Questions and conversations are a rich source of data about student progress toward the learning goal, provided that the teacher is using techniques that reveal what each student knows. Questions and conversations can also occur among students in a group, with the teacher observing these interactions. The teacher may be able to gather data about student understanding simply by listening to the conversation in a think-pair-share, rather than leading the conversation.

36 Self-Evaluation

Student self-evaluation may incorporate reflections, reveal lingering questions, and identify perceived strengths and/or weaknesses.
Rubrics and checklists
Reflection journals with specific prompts
Student-led conferences
Exit slips

Students, of course, are key players in the formative assessment process. However, as mentioned in the previous slide, the teacher does not always have to be leading a conversation or actively engaging the student in verbal questioning. Student self-evaluation may provide the teacher with new and important information about student understanding.

Rubrics and checklists, for example, help students self-evaluate, but also organize their self-evaluation. These tools give students a visual reminder of the tasks or skills required for mastery of the assigned topic. They are able to see and then move to close any gaps in their learning. Have you observed students using rubrics or checklists to self-assess their own work? What did the teacher do with the student self-evaluation? Rubrics are one tool that students may use to identify personal strengths and/or weaknesses. It can be insightful for students to participate in the development of a rubric. For some students, a collection of anonymous work samples can communicate different levels of quality in a task or product they have been assigned. Have you seen teachers use work samples to inform students about success criteria?

A reflection journal offers a way to collect personal reflections, questions, and responses from each student. As with any other formative assessment, it is essential for the teacher to use the data provided to guide instruction for individuals and for the group as a whole.

For a self-evaluation exit slip, a student may ask a question or identify a topic on which he feels he needs additional instruction or practice. This gives the student a chance to gauge his own knowledge and decide where the gaps occur. Exit slips provide a means for gathering extended, individualized responses from students. Exit tickets with no name on them provide data more broadly for the whole group and may inform pacing needs, patterns in responses, and so on. Signed exit tickets allow more granular attention to specific student strengths and weaknesses. It is important to remember that, while they are a piece of the whole picture, exit slips alone will not provide all of the benefits of good formative assessment. Once a student leaves the room, the opportunity to immediately repair any misunderstanding leaves as well. Exit slips may provide a teacher information for the next class session, but another formative assessment may be a better choice for in-the-moment interactions. The bottom line is: how does the teacher use the information to enhance the learning experience of students?

37 Artifacts

Collecting artifacts of learning, and then analyzing that collection, assists teachers and students in measuring progress over a period of time. Any work produced by a student or group of students could be considered an artifact.
Folders of students’ work
Class or group work samples
Chronological files over multiple years

Teachers may choose to collect artifacts of student learning and keep them in a central location, such as a folder of an individual student’s work or of an entire class’s work samples. These folders or artifacts collected may include work samples, observation notes, student self-reflections or journals, input from parents, and assessment data from summative tests/benchmarks. This collection of artifacts assists teachers with understanding student learning needs and measuring progress over a period of time (usually the school year). Artifacts may be useful in identifying and closing learning gaps when combined with the question and conversation strategies. Do any of your teachers use a system of artifact collection to gather data about student learning?

Class work samples are useful to the teacher for grouping purposes or when planning and selecting learning activities. Work samples allow the student to see personal progress and may facilitate conversations about best work. Work samples may reveal patterns that help to pinpoint student misconceptions. Formative assessment data gathered from student work samples or artifacts informs the teacher as s/he prepares the lesson plan. When work samples are collected over time, they may be used to establish anchor samples. It is helpful for students to see samples of work products; in fact, some studies have shown that work samples shared with students (particularly when students are empowered to rank or rate them according to success criteria) are actually more useful to the student than using a rubric to explain quality performance levels.

Chronological files that accumulate student information (work samples, assessment data, and so on) can be useful for adjusting instructional supports to meet student needs. This type of file enables looking back to determine patterns in student learning. This can be informative for supporting any student but especially useful when working with students with disabilities or those who are English language learners.

If a teacher is using project-based learning, there should be checkpoints along the way to determine that students are progressing as intended. The teacher should be prepared to assist students as they work on long-term projects such as research papers or science projects. Without checking along the way and repairing student misconceptions as the project is in process, the teacher may be left with a summative assessment rather than an opportunity for formative assessment. Remember that the goal of formative assessment is to gather data that supports making adjustments in teaching and learning in the moment or along the learning continuum rather than after instruction ends.

With all of the processes and products we have reviewed, it may be best for teachers to assess students in a variety of ways. A daily five-question, multiple-choice check will be beneficial in some ways, but it may be more beneficial to expand assessment opportunities for students. Providing different types of assessments affords students the chance to show what they know and can do. Finally, formative assessment should not be too disruptive or take away too much valuable instructional time.

In-the-moment interactions between students, and between students and teachers, happen seamlessly throughout the instructional period. As you evaluate and critique the formative assessments you brought with you today, think to yourself: How much instructional time is needed for this? Are students silent? Are they using resources or supports? Is there an opportunity for a student to repair misconceptions through self-evaluation or through working with a teacher or peer? Will a teacher immediately be able to use the data from this assessment; if not, how long does he or she have to wait? Does this assessment closely resemble the assessments I brought to sessions 2 and 3? Has everyone at my table brought the same type of assessment? Asking yourself these questions may be beneficial as you continue the work this afternoon.

38 *Formative Assessment
Formative assessment measures short-term goals and compares each student’s current learning to the pre-established criteria. Formative assessment data are used to: Inform immediate instruction, Identify student misconceptions, Uncover missing building blocks, and Close a gap that exists between what a student knows and what a student needs to learn. Now that we have reiterated some of the processes and products that represent formative assessment, let’s revisit the criteria or features of formative assessment. Recall that formative assessment is short-term, informs immediate instruction, is used to uncover misconceptions or missing building blocks, and is used to close the gap between what a student knows and what a student needs to learn (based on the comparison of student learning to specific criteria). Gathering evidence of formative assessment may require an observer to pay close attention in order to capture those subtle interactions between teacher and student or student and student.

39 Formative Assessment Activity
Select two samples of formative assessments from those you brought to this session. Answer the following questions for each assessment: Is the assessment aligned in content and cognitive level to the identified standard? How do you know? Which process or product was used to gather evidence of student learning? Approximately how much time would a student need to complete the assessment? Use the provided rubric to evaluate each assessment. Complete these steps independently as you review and evaluate the formative assessments you brought today. Please refer to your participants’ handout to see the previous slides listing the processes and products. While using the rubric to evaluate the assessment, feel free to use any adaptations you may have developed as a school or division.

40 Formative Assessment Rubric
Performance levels: Absent, Emerging, Proficient, Expert.
Alignment. Absent: Material assessed is not aligned to the content of the SOL it addresses. Emerging: Material assessed is aligned to the content, but not the cognitive level, of the SOL it addresses. Proficient: Material assessed is aligned to the content AND cognitive level of the SOL it addresses; students are informed of how they are being assessed and/or on what learning. Expert: Clear, precise alignment of content AND cognitive level; students can articulate how they are being assessed and on what learning.
Data Available. Absent: Assessment does not provide data, or teacher does not collect the data provided. Emerging: Assessment provides limited data about student knowledge for the SOL being assessed. Proficient: Assessment provides sufficient data about student knowledge for the SOL being assessed. Expert: Assessment provides clear, precise data about student knowledge for the SOL being assessed.
Data Use. Absent: Teacher does not use or plan to use the data gathered, or teacher did not gather data from the assessment; students are provided with an activity rather than an assessment. Emerging: Teacher uses or plans to use data to gain general or broad information about student learning, or to make changes in the lesson for future classes. Proficient: Teacher uses or plans to use data in order to close specific learning gaps within the current classroom. Expert: Teacher articulates a clear, precise, and specific plan to use data in order to close learning gaps, differentiate future instruction, and adjust pacing as needed.
Inclusive. Absent: Assessment provides data for fewer than 50% of students in the class. Emerging: Assessment provides data for more than 50% of students in the class, but it does not provide data for all* students. Proficient: Assessment provides data for all* students in the class. Expert: Assessment provides clear, precise data for each student in the class.
Please refer to the document titled Session 4: Formative Assessment Rubric. This rubric was developed to help rate the quality of a formative assessment used during classroom instruction. The only requirement is that you use this rubric for today’s work; if you choose to take it back to your school division, we hope that is because it is a tool that will work for your school or that can be modified in a way that will work for your school. Notice that the rubric includes four essential components: Alignment between the assessment and the content and rigor of the standard it addresses. Without alignment, the assessment may provide irrelevant information about student progress toward the learning goal. Issues with alignment can result in students who are ill-prepared for a summative assessment that is aligned to the standards and curriculum framework. The assessment provides data that can be used to determine student knowledge of the standard being assessed. The teacher uses the data purposefully to modify instruction, to close learning gaps, and to differentiate instruction. Recall from the last few sessions that the data piece is essential to formative assessment. See the “Data Use” row under the “Absent” column: “Students provided with an activity rather than an assessment.” The assessment is inclusive of all students, providing data for each student in the class/group. A teacher may not be able to effectively close specific learning gaps if he or she is not aware of the whole class and its progress toward a goal. Also, of course, we expect teachers to go into the classroom prepared to reach all students, not just a select few. Notice also that this rubric describes four distinct performance levels: absent, emerging, proficient, and expert.

41 Formative Assessment Activity
Divide into groups of 3-4. Each participant shares one assessment and the corresponding question responses and rubric ratings with group members. As a group, (a) discuss each assessment and its ratings and (b) determine the most immediate/essential feedback that should be provided to the teacher regarding the formative assessment. Be prepared to share with the whole group. Now we will begin the partner or group work section of this activity. Steps four and five may occur in a round-robin format, with sharing by one participant followed by feedback from the others in the group. After one person shares and receives feedback, the person to his/her right then shares and receives feedback from the others in the group. Each individual can take notes on the feedback they receive. Using chart paper, each group should record the essential feedback provided for the assessments. Groups should be prepared to share out their feedback, and all participants are asked to pay close attention to any themes that may appear. Are there any global areas of strength or weakness?

42 The Lesson Objective as Formative Assessment
Let’s take a few minutes to revisit instructional objectives and continue to think about how a lesson objective, complete with behaviors, conditions, and success criteria, can actually serve as a formative assessment. Recall that the most effective formative assessment in the classroom may be the planned assessment that is embedded in instruction.

43 Why Are Objectives Important?
Objectives focus the learning for teachers and students. Student achievement increases when teachers communicate clear objectives to students and provide quality formative feedback. Students take greater risks and persevere longer when they know the objectives and criteria. This slide reminds us of the positive effects on student learning that result when teachers communicate objectives to students during the lesson. First, it helps provide a focus. This allows students to better organize information as they learn it. Second, it has been shown to increase student achievement, especially when combined with formative assessment of whether students are achieving the objective and how they can improve. Last, when students know the objectives and the criteria by which they will be evaluated, they are more likely to take risks and persevere longer. (Brookhart, 2008; Hattie & Timperley, 2007; Shute, 2008).

44 Objectives Objectives communicate clearly what the student will accomplish during the given lesson. An objective should include: Behaviors students will exhibit to show learning, Conditions under which the students will exhibit those behaviors, and Criteria the teacher will use to determine whether students meet the objective. As we saw in the Academic Review Lesson Plan and the Lesson Observation tools, objectives generally include the following pieces: Behaviors students will exhibit to show learning, Conditions under which the students will exhibit those behaviors, and Criteria the teacher will use to determine whether students meet the objective. Note: As we observe teachers, we look for evidence that the teacher actively communicates each piece of the objective to students; this could be done in a variety of ways. Objectives guide the teacher’s work and enable both teacher and student to monitor progress toward the learning goal. We are also looking for teachers to communicate the lesson objective in student-friendly language. The words and concepts should be age-appropriate, but the teacher should not strip important content-specific vocabulary from the objective. If a word is used in the Standard, students are expected to know that vocabulary.

45 Activity Using Objectives from the Classroom
Select two objectives from those you brought to today’s session. Assess whether each objective is aligned to the curriculum framework in content and cognitive level. Identify the behaviors, conditions, and criteria for each objective, using either multi-colored highlighters or the template provided. Select one objective that did not include all components and rewrite the objective. Be prepared to share your rewritten objective with the group. For the template, please refer back to the handout Session 2: Examining Objectives for Behaviors, Conditions, and Criteria. We are now going to work with the new set of lesson objectives that you brought to today’s session. Step one is to read over all the objectives you brought and identify two to examine more closely. Once you have done this, please proceed to step two, which is to determine if each of the two objectives you chose is aligned to the standard and curriculum framework in content and cognitive level. You will need to access the curriculum framework through the internet to do this. As we all have agreed in the past, the world’s most perfectly written objective will not be effective if it is not aligned to the curriculum standard’s content and cognitive level. However, for the purpose of practice we will proceed to step three regardless of the alignment of the objective. In step three, we ask that you read each objective and identify the behaviors, conditions, and success criteria for each one. You may indicate the different parts with different-colored highlighters, or you may choose to use the template that we have provided for you. Please note the verb of each objective and whether it shows a measurable behavior. This may be especially helpful if you are having trouble rewriting the objective to include conditions and criteria. For example, if the objective states that a student will “understand” a concept, what does that look like? How will a teacher measure that a student “understands”? Learning objectives should be written so that the teacher can actually see and measure student progress. The verbs in these objectives should focus on what a student will make, say, do, or write. Instead of “The student will understand the difference between cause and effect,” what could the student make, say, do, or write to SHOW that understanding? As you work through your objectives, look for verbs that do not relate to a specific student behavior and therefore make it more difficult for teachers to measure progress. Some examples of verbs that do not show behavior are understand, learn, and know, but you may find others. Step 4: Choose one objective and rewrite it to include behaviors, conditions, and criteria. Step 5: At this point, we will ask for volunteers to share one of their selected objectives and their thinking on the presence or absence of behaviors, conditions, and criteria for success. The presenter will facilitate this process and use questioning to probe the participants’ thinking.

46 Additional Activity: Formative Assessment and Objectives
Review a formative assessment you brought today. If the formative assessment did not score Proficient on the rubric, note what changes could be made to the assessment. Using the formative assessment and any suggested changes as a guide, write a corresponding objective that includes the behaviors, conditions, and criteria required for the student to complete or produce the assessment. Check the standard and curriculum framework to ensure the objective is aligned to the standard in content and cognitive level. As an additional activity combining our work with formative assessment and objectives, please complete these four steps. If a teacher has embedded a formative assessment within the objective, what would that look like? For this activity, we will essentially be working backwards, using a formative assessment to write an objective.

47 How Do We Know We Are Making Progress?
Questions to Consider How are we focusing on teaching and learning in terms of measurable student progress? How are we following a process for progress monitoring at the classroom level, the school level, and the division level? How are we making adjustments based upon progress monitoring? Here are the questions we leave you with after each session. As instructional leaders (at the school and division levels) you are likely to be engaged in regular conversations around these questions. Remember the cycle embedded in slide four (outcomes for AARPE Part 2)? These questions link that cycle to the work of school improvement. You also see a cycle embedded in the goal-setting process; these questions may guide you as you reflect on your work with goal setting.

48 Classroom, School, Division
How Do We Know We Are Making Progress? Classroom, School, Division Set clear and measurable goals. Implement action steps with fidelity. Monitor and provide feedback. Adjust based upon monitoring and data. Repeat the cycle. As you move forward to improve instruction, increase student achievement, and improve school effectiveness, it is valuable to know what steps to take to ensure that we are making progress. We know that school improvement requires a plan with clear and measurable goals. The plan requires action steps that are implemented with fidelity, and requires ongoing monitoring, feedback, and adjustments or corrections along the way. Those adjustments must be evidence-based, using the data that is gathered during monitoring.

49 Next Steps (Completed by Next Session-1)
Division leaders will meet with AARPE principals to develop plans for the principals’ delivery of professional development to appropriate school staff on (a) evidence-based look-fors that support Standard 4 and (b) the relationship between instructional objectives and formative assessment. Plans need to include the date, time, and draft agenda items, and they must be incorporated under the appropriate indicator and task in the SIP. For next steps, division leaders should meet with and support AARPE principals as the principals deliver appropriate professional development on some or all of the concepts we covered today: (a) evidence-based look-fors that support Standard 4 and (b) the relationship between instructional objectives and formative assessment.

50 Next Steps (Completed by Next Session-2/3)
Division staff will develop plans for AARPE professional development for all 2nd-year warned schools. Plans need to include the date, time, and draft agenda items, and they must be incorporated under the appropriate indicator and task in each school’s SIP. Complete the agreed-upon number of inter-rater reliability (IRR) walkthroughs and observations (division leaders, principals, OPs). Division staff will provide professional development to principals of 2nd-year warned schools. Please remember to bring a hard copy of that plan or agenda. Also, please bring the agreed-upon number of completed inter-rater reliability (IRR) walkthroughs and observations. This next step is for division leaders, principals, and OPs.

51 Next Steps (Completed by Next Session-4)
Samples of: lesson objectives (behaviors, conditions, criteria); collected formative assessments; observations; and feedback on Teacher Standard 4. Finally, please bring the following samples as we continue to work on Standard 4, Assessment of and for Learning.

