
0 District Improvement Facilitators Network
August 18, 2014 All up front

1 Introductions – all participants
Session Objectives: SIFN/DIFN Structure; PLCs; Program Evaluation Tool; AER and DPR; MiSchoolData & CNA; Facilitated Work Time
8:30-8:45 Sandy

2 Working Agreements
Participate Fully; Press for Clarification; Collaborate; Share your Thinking
Sandy

3 Mission of DIFN and SIFN
It is the mission of the Jackson County School Improvement Consortium to support a community of collaboration using a continuous School Improvement process in order to increase student achievement. Sandy

4 Agenda
8:30 Introductions and Session Objectives; DIFN/SIFN Structure; PLCs; Program Evaluation Tool
9:30 Break
9:45 AER & District Process Rubrics; MiSchoolData & CNA
11:30 Lunch
12:00 Facilitated Work Time (Optional)
Maeghan

5 District Improvement Facilitators Network
August 18, 2014: Program Evaluation, AER, DPR, CNA
December 10, 2014: District Process Rubrics; Consolidated Application; Progress Monitoring
May 21, 2015: District Improvement Plan; Program Evaluation; Data/Assessment
Maeghan

6 School Improvement Facilitators Network – Elementary & Secondary
Sept 24/Oct 1: Review SIP; Strategy Implementation Guides; Stakeholder Rollout; Perception Data; Action Step Plans
Jan 22/Jan 29: Progress Monitoring; Local Data; School Process Rubrics/ASSIST
Maeghan

7 School Improvement Facilitators Network – Elementary & Secondary
March 19/March 20: Executive Summary; Stakeholder Feedback Diagnostic; Additional Requirements Diagnostic; Program Monitoring/Program Evaluation
May 12/May 13: Data/Assessment; School Data Profile Analysis; Title One Diagnostic; Program Evaluation Tool
Maeghan

8 Jennifer

9 Understanding Change (Ambrose, 1987, "Managing Complex Change")
Trust + Vision + Skills + Resources + Payoff + Action Plan + Shared Values/Beliefs = Second Order Change
Missing Trust = Sabotage
Missing Vision = Confusion
Missing Skills = Anxiety
Missing Resources = Anger
Missing Payoff = Sporadic Change
Missing Action Plan = False Starts
Missing Shared Values/Beliefs = First Order Change
Jennifer

Optional Activity: The School Improvement process is a change process. It is important to note that not all change is of the same magnitude, and some changes have greater implications than others for various stakeholders. Before implementing changes that result from the School Improvement Plan, it is important to recognize the level of change the plan calls for. Anticipating the response to the change, and even writing activities into the plan to help those implementing it accommodate the change, can go far in ensuring its success.

Change theorists have suggested that there are two levels of change: first order change and second order change. First order change is characterized by modifications of past practice within existing paradigms, implemented with existing knowledge and skills. It might also be thought of as doing work in new ways resulting from "in-the-box" thinking. Second order change is characterized by a break with past practice and working outside of existing paradigms. Implementing such change requires new knowledge and skills. It might also be thought of as doing new work resulting from "out-of-the-box" thinking. An example of first order change might be moving to 90-minute teaching blocks while simply doing the same thing as in a 45-minute block, but for twice as long; second order change would be using the teaching blocks for a different kind of learning, such as projects and collaborative groups.

It is important to recognize which changes are of which order. Recognizing the difference can help leaders select practices that are appropriate for the stakeholders who will be implementing them, and it generally results in more sustainable efforts and a positive impact on student achievement. On the other hand, a negative impact will likely result if we use first order change when second order change is needed, or if we assume that both types of change have the same impact.

Activity: Note the elements of change identified across the top of the chart. (Payoff refers to whether those implementing the change believe that the change will benefit them in some way.) Change theorists believe that if all elements are in place, second order change will result; if one of the elements is missing, other things will result. Post the following terms someplace in the room or have participants write them on a piece of paper: anger, anxiety, confusion, false starts, first order change, sabotage, sporadic change. Have them, individually or in small groups, predict what will happen if each of the elements is missing. E.g., in the second row, what will result if trust is absent? Once participants have completed the activity, reveal the answers. (Note: The slide is set up so that the boxes in the last column are empty until revealed one at a time with the forward key on the computer.)

Conversation: Have participants identify a past school improvement practice that might not have been successful, and indicate what kind of change might have been called for and which of the elements might have been missing that may have caused the lack of success. Also talk about what kinds of activities might be written into the plan to increase the chances that changes called for in a School Improvement Plan will be successful.

10 Unknowns for the Coming Year
Known changes: Program Evaluation Tool; SI Framework changes and support
Uncertain: SI Framework approval timeline; assessment details; Focus & Priority school requirements & support
Steve

11 The Premise of PLCs
No single person has all the expertise, skill, and energy to lead a district, improve a school, or meet the needs of every child in his or her classroom. Dispersed leadership is a prerequisite for bringing the big ideas of the PLC process to life. Leaders help sustain the PLC process by removing obstacles and celebrating progress.
Steve: This slide introduces the PLC model. Is there any research in any organizational context that says, "people work better in isolation"? It doesn't exist. Video (truncated link): ndex=2&list=PLG4LE4_x42j-jbQe-S4DVo9HXQJnl0ft9 — play from 1:26 to completion.

12 The Importance of PLCs Sandy

13 Distributed Leadership of PLCs
Leadership is characterized by . . .
Working with others to establish a shared sense of purpose, goals, and direction
Persuading people to move in that direction
Clarifying the specific steps to be taken to begin moving in the right direction
Providing the resources and support that enable people to succeed at what they are being asked to do
Steve: If no single person has the capacity/skills/talents to ensure learning for all students, what is a leader? This slide begins to define leader characteristics in collective terms.

14 Four Essential Questions
1. What do we expect our students to learn?
2. How will we know they are learning?
3. How will we respond when they don't learn it?
4. How will we respond when they already know it?
This will lead to the new handout from Mark at Solution Tree that connects the definition to the four questions and to how we can/should respond to each.

15 Four Pillars: MISSION PILLAR
MISSION: Why Do We Exist? Defines fundamental purpose; clarifies priorities and creates focus.
"The words of a mission statement are not worth the paper they are written on unless people begin to do differently." -DuFour
Sandy

16 Four Pillars: MISSION and VISION PILLARS
MISSION: Why Do We Exist? Defines fundamental purpose; clarifies priorities and creates focus.
VISION: What Must We Become? Describes a compelling future; gives the school a direction.
"A vision builds trust, collaboration, interdependence, motivation, and mutual responsibility for success. Vision helps people make smart choices, because their decisions are made with the end results in mind…vision allows us to act from a proactive stance, moving toward what we want…vision empowers and excites us to reach for what we truly desire." -Blanchard, 2007, p. 22
Sandy

17 Four Pillars: MISSION, VISION, and VALUES PILLARS
MISSION: Why Do We Exist? Defines fundamental purpose; clarifies priorities and creates focus.
VISION: What Must We Become? Describes a compelling future; gives the school a direction.
VALUES: How Must We Behave? Collective commitments; guides individual behavior.
"(High-achieving schools) build a highly collaborative school environment where working together to solve problems and to learn from each other become cultural norms." (WestEd, 2000, p. 12)
Sandy

18 Four Pillars: MISSION, VISION, VALUES, and GOALS PILLARS
MISSION: Why Do We Exist? Defines fundamental purpose; clarifies priorities and creates focus.
VISION: What Must We Become? Describes a compelling future; gives the school direction.
VALUES: How Must We Behave? Collective commitments; guides individual behavior.
GOALS: What Steps? When? Targets and timelines; establishes incremental steps.
Sandy

19 Seven Keys to Effective Teams
1. Embed collaboration in routine practices of the school, with a focus on learning
2. Schedule time for collaboration into the school day and school calendar
3. Focus teams on critical questions
4. Make products of collaboration explicit
5. Establish team norms to guide collaboration
6. Pursue specific and measurable team performance goals
7. Provide teams with evidence of student learning to improve professional practice
Sandy

20 “On the plus side, this gives our PLC something to chew on.”

21 Seven Keys to Effective Teams (slide repeated) Sandy

22 "Can anyone, anyone, tell me how a semicolon is used other than in an emoticon?!"

23 Seven Keys to Effective Teams (slide repeated) Sandy

24 Example of a Timeline of Team Products
By the end of the…
2nd week: team norms
4th week: team SMART goal
6th week: common essential outcomes
8th week: first common assessment
10th week: analysis of student performance on the first common formative assessment

25 The Importance of Team Products
"Without discrete team work-products produced through the joint, real contributions of team members, the potential of teams to dramatically improve performance goes untapped." (Katzenbach & Smith, The Wisdom of Teams: Creating the High-Performance Organization, 1993, p. 90)
Sandy

26 Celebrate
"Visible measures of progress are critical for motivating and encouraging educators to persist in the challenging work of improvement. Even the most dedicated and optimistic among us will stop if there's no sign that what we're doing is making a difference, or might make a difference eventually." -Elmore & City, 2007

27 District Team Meeting Structure
Fall: AER; evidence collection; progress monitoring
Winter: DPR; Consolidated Application/budget amendment
Spring: District Improvement Plan
This meeting schedule looks different from a school improvement meeting schedule.

28 AER: Annual Education Report

29 Required Components: Annual Education Report (AER)
AER Components: district letter; combined report; posting on website
Step-by-Step Guide for Completion: handout; posted on website

30 MiSchoolData.org

31 District Process Rubrics
Review your score in each strand…
Discuss your rankings: What were the district's strengths? Where did the district need improvement? How does this impact student achievement?
Looking at the comments for each strand: What progress has been made? What are the next steps? What is the supporting evidence?
Print each district's DPR Summary.
Steve: Provide time for each district to consider strengths and weaknesses.

32 District Process Rubrics
Plan for communicating to stakeholders. Communications: PTO, newsletter, website, discussion forums, staff meeting, school board meeting.
Plan for survey format: through AdvancED; SurveyMonkey.com; custom through JCISD.
Steve: This is one-way communication, but you may also want to consider two-way communication. How are stakeholders providing feedback?

33 Changes are pending to the School and District Improvement Frameworks
Revised DIF: 4 strands, 10 standards, 10 indicators
Revised SIF: 4 strands, 10 standards, 0 benchmarks, 26 indicators
Steve: Provide time to analyze the new District and School Process Rubric draft documents.

34 MDE Program Evaluation Tool
Maeghan

35 Program Evaluation Tool
NEW diagnostic for this school year. Its purposes:
To impact student achievement and close gaps for subgroups
To ensure that high-quality planning, implementation, and evaluation are part of the Continuous Improvement Process
To ensure ongoing engagement of multiple stakeholders (students, teachers, parents/community, administrators) in the planning and evaluation process
To maximize the use of resources to impact student learning
To meet state and federal requirements

36 State and Federal Requirements
MICHIGAN: Annual evaluation of the implementation and impact of the School Improvement Plan; modification of the plan based on evaluation results.
FEDERAL: Annual evaluation of all federal programs for effectiveness and impact on student achievement, including subgroups; modification of the plan based on evaluation results.

Share the federal mandates listed below with participants and use the citations as needed:

Guided by evidence-based practices, ESEA requires all states and their districts (subgrantees) to progress monitor and evaluate the effectiveness of strategies/initiatives/programs/reforms funded by federal monies in order to examine their impact on students' academic achievement. For example, Schoolwide programs must include annual evaluation as part of the annual cycle for examining successful implementation and impact on the success of all subgroups; plans of Targeted Assistance schools must be progress-monitored and evaluated for impact on the achievement of targeted students. (Title I, Part A: 34 CFR (c); Title I, Part A [Sec (c)(2)(B) - Targeted Assistance; Sec (b)(1)(B)(iii)(II) - Schoolwide])

Title III, English Language Acquisition Program (English learners), requires districts to fulfill the program improvement responsibilities, annually evaluate the English learner program, and use ELs' academic and language outcomes to determine action steps for restructuring, reforming, and improving all relevant EL programs, activities, and operations relating to language instructional education programs and academic content instruction. [ESEA Sec. 3115(a)(3) and 3121]

Title I, Part C, Migrant Education Program requires conducting a comprehensive needs assessment and establishing clear goals whose implementation is monitored to ensure impact on migrant students' achievement. Local migrant programs are required to conduct an annual program evaluation to determine the impact of programs and services on the academic achievement of migrant students. [Sections 1301(4); 1304(b)(1,2); 1306(a, c & d)]

Title I, Part D, Subpart 1 or 2, Neglected and Delinquent Programs: entities receiving these funds are required to be evaluated annually for impact on eligible students' achievement and program outcomes. [Section 1432(a)(1-5)]

Title II: The district must have a written process in place to evaluate how Title II, Part A activities will impact student achievement; Title II program services must be evaluated annually for effectiveness and impact on student achievement. [ESEA Section 2122(b)(9)(D)] The Title II, Part A Class-Size Reduction initiative must follow Michigan Department of Education policy requirements, and the district must have a process in place to evaluate the effectiveness of the initiative. [ESEA Sections 2122(b)(1)(B) and 2122(b)(2)]

If asked whether priority schools are required to use the MDE Evaluation Tool, respond by saying that Year 2 and up priority schools are required to use the Tool. Year 1 schools could use it more as a planning tool, as they haven't implemented their plan yet. SRO is looking at it as a way to assess the readiness of schools to implement their plans.

Note on slide: The picture to the left is an insert with a summary of the NCLB requirement and an excerpt of Michigan's Revised School Code. ISDs/RESAs are required by PA 25 to provide technical assistance to schools and districts to develop annual evaluations. ESEA requires annual evaluations of programs funded by federal programs such as Title I, Parts A, C, and D; Title II; and Title III.

37 Program Evaluation Timeline
District/School Improvement Plans: include program evaluation activities to support Program Evaluation as part of the Continuous Improvement Process.
School year: implement Program Evaluation activities throughout the year.
June 30, 2015: Program Evaluation submitted in ASSIST. A completed program evaluation using the MDE Program Evaluation Tool will be required for submission of the Consolidated Application for 2015-2016.
Summer 2015 and beyond: sustain professional learning to discuss successes, challenges, and any necessary follow-up training materials and support systems.

Read through the timeline and address any items that need clarification. Spell out the acronyms:
OFS = Office of Field Services
ISD = Intermediate School District
SIFN = School Improvement Facilitators' Network
OEII = Office of Education Improvement and Innovation
MICSI = MI Continuous School Improvement
LEAs = Local Educational Agencies
SRO = School Reform Office

If asked whether priority schools are required to use the MDE Evaluation Tool, respond by saying that Year 2 and up priority schools are required to use the Tool. Year 1 schools could use it more as a planning tool, as they haven't implemented their plan yet. SRO is looking at it as a way to assess the readiness of schools to implement their plans.

38 Program Evaluation Tool

39 Program Evaluation Tool
What are districts required to evaluate? ONE evidence-based strategy/program/initiative that would make the greatest impact on student achievement; a district-level initiative.

40 Questions for Evaluation
Planning for evaluation! The main questions used in the MDE Program Evaluation Tool are:
1. What was the impact of the strategy/program/initiative on students?
2. What was the readiness for implementing the strategy/program/initiative?
3. Did participants have the knowledge and skills to implement the plan?
4. Was there opportunity for high-quality implementation?
5. Was the strategy/program/initiative implemented as intended?

41 Program Evaluation Tool
Select Diagnostics & Surveys tab

42 Program Evaluation Tool
Select Start Diagnostic

43 Program Evaluation Tool
Under Choose a Template, select Program Evaluation Tool

44 Program Evaluation Tool
Name it under Description and select Start

45 Program Evaluation Tool
Select the section heading to view and/or respond

46 Program Evaluation Tool
Select Respond to answer a question

47 Program Evaluation Tool
Strategy/Program/Initiative Description: review your District Improvement Plan; select a strategy; log in to ASSIST; respond to the questions in the first section.

48 Program Evaluation Tool
Resources ASSIST Guide MDE Program Evaluation Tool FAQ MDE Rubric for Review MDE Program Evaluation Tool Chart

49 Scorecard Accountability
Jennifer: Scorecard accountability system changes in colors. MDE changed a few tags, but it is still the same system. Results will be released sometime in August.

50 Scorecard Accountability
Open to the public

51 Scorecard Open to the public

52 Top to Bottom Ranking Open to the public

53 Scorecard Summary
From the MDE website, BAA, or MiSchoolData; no login needed for the list.

54 Top/Bottom 30 Analysis
Hover over the graph and it will give you the number of students in each category. Choose one content area to look at per report.

55 Identifying Bottom 30% students
Steve: Downloading the file

56 Identifying Bottom 30% students
Downloading the file You can further select individual school buildings for their data files

57 Identifying Bottom 30% students
Downloading the file

58 Identifying Bottom 30% Students
Working with the file: Once the file is downloaded, you can filter to find student-level information. The bottom 30% student lists are unique to each subject area. In the bottom 30% columns: 1 = top 30%, 2 = middle 40%, 3 = bottom 30%. The file is huge! (52 columns × the number of students)

59 Key Columns in the BAA Data File
Working with the BAA data file:
Reading: AB = True & AG = 3
Writing: AB = True & AS = 3
Math: AB = True & BB = 3
Science: AB = True & BN = 3
Social Studies: AB = True & BW = 3
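The same filtering can be scripted rather than done by hand in Excel. Below is a minimal pandas sketch, not part of the original session materials: the filename is hypothetical, the column letters (AB, AG, AS, BB, BN, BW) come from the slide and are converted to positions, and the assumption that the flag column holds the text "True" and the ranking columns the number 3 may need adjusting to your actual download.

```python
# Hypothetical sketch (not from the session): list bottom-30% students per
# subject from a downloaded BAA student data file.
import pandas as pd

def col_index(letters: str) -> int:
    """Convert a spreadsheet column letter (e.g. 'AB') to a 0-based index."""
    idx = 0
    for ch in letters.upper():
        idx = idx * 26 + (ord(ch) - ord('A') + 1)
    return idx - 1

# Subject -> (flag column, bottom-30% ranking column), per the slide.
SUBJECTS = {
    "Reading":        ("AB", "AG"),
    "Writing":        ("AB", "AS"),
    "Math":           ("AB", "BB"),
    "Science":        ("AB", "BN"),
    "Social Studies": ("AB", "BW"),
}

df = pd.read_excel("baa_student_data.xlsx")  # hypothetical filename

for subject, (flag_col, rank_col) in SUBJECTS.items():
    # Flag may be stored as boolean or text; normalize before comparing.
    flag = df.iloc[:, col_index(flag_col)].astype(str).str.lower() == "true"
    rank = pd.to_numeric(df.iloc[:, col_index(rank_col)], errors="coerce")
    # 1 = top 30%, 2 = middle 40%, 3 = bottom 30% (per the slide)
    bottom30 = df[flag & (rank == 3)]
    bottom30.to_csv(f"bottom30_{subject.replace(' ', '_').lower()}.csv",
                    index=False)
    print(f"{subject}: {len(bottom30)} students in the bottom 30%")
```

Writing one list per subject mirrors the slide's point that the bottom-30% lists are unique to each subject area; the webinar described on the next slide covers the same filtering done manually in Excel.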

60 Additional Training Support
Additional support is available on the JCISD website under Programs and Services, Data Resources. A 10-minute webinar is available to support you. It assumes you have downloaded the file in Excel format and guides you through basic data filtering. From this, you can create student lists of your bottom 30% in each subject area.

61 For additional support
Contact your Region SI Coordinator: Maeghan McCormick, Sandy White, Steve Doerr, Jennifer Fox.
Here's where to turn if you have further questions.

62 What do we monitor?
Whether adults are implementing the strategy with fidelity, and the impact implementation is having on students.
Jennifer

63 This slide, from Dr. Doug Reeves, shows data comparing what happens to student achievement when adult implementation is or is not monitored. Essentially, research indicates that there is a direct correlation between the degree of implementation and the impact it will have. (Leadership and Learning Center, 2010)

64 Monitoring Implementation
Demographics; Student Outcomes; Perceptions; School Processes
These are four different kinds of data that you have available. How does your district approach and align them? Think about each type of data and what you might collect for each goal/strategy/activity:
Increase enrollment of females in science courses: demographic data; counselors pull enrollment numbers.
Increase proficiency by 2.5% on MEAP: student outcome data; analyze state testing results (end of year).
Teachers will implement PLCs: school process data; meeting agendas.
Increase math proficiency: perception data from parents; analyze survey results.
The data you collect should be measurable and not based only on opinion or observation. Remember to collect the four different types of data (demographic, student outcome, perception, and process) as much as possible. Using multiple types of data gives a more accurate picture of what is really happening. Meeting schedules, meeting minutes, ongoing student data, and walkthrough data are just a few types of evidence that can provide useful data. See Question #4 in the MDE Program Evaluation Tool, via the Course Resources link, for more ideas and suggested evidence on monitoring implementation fidelity.

65 Comprehensive Needs Assessment
Why? A "needs assessment" is a systematic set of procedures used to determine needs, examine their nature and causes, and set priorities for future action.
Helps to provide a basis for the allocation of funds
Places data in one location for access/transparency
Multiple formats, but three key components: exploring the status quo, gathering and analyzing data, and decision making
Notes: Could use multiple formats, tied to a strategic plan, in community teams, etc. Take from processes already completed (targeted/schoolwide). Re-assess needs for allocations.

66 Comprehensive Needs Assessment
How? Some basic suggestions for steps you might take:
Step One: Determine the purpose. What decisions will be made or enhanced with the information?
Step Two: Determine what existing information is available, who will collect it, and how it will be collected and analyzed.
Step Three: Determine what information is still needed to make the best possible informed decision. How will it be collected, analyzed, and compared to existing information?
Step Four: Determine how decisions will be made regarding the information and how stakeholders will be involved.

67 Comprehensive Needs Assessment
Multiple samples are online under Resources: a template following the old MDE process, or a design aligned with a strategic plan or other initiative.
District plan for collecting:
Process data: SPR/DPR comparison?
Student achievement: consistent areas of concern/success?
Student outcomes: graduation, attendance, course alignment/success?
Perception data: district alignment of survey schedules/information?
Discussion questions: When do you compare Improvement Plans or Process Rubrics, and how do you set goals? What do you do with last year's achievement data, and how is it shared? What effect might this have on teaching and learning? Are student outcomes (soft skills, study skills) compared to achievement data, e.g., the number of students passing AP compared to attendance? Are multiple surveys from multiple grades taken each year? What is the focus of each survey, and what is done with the results?

68 District Data Profile Analysis/CNA
Demographics; grade-level achievement; subgroups; gaps and trends; non-academic data; summary data.
A CNA is required for Title I and Priority schools. Why do it if you are not a Title I or Priority school? It gives a clear snapshot of the district make-up, organizes MiSchoolData information into a cohesive format, only needs updating after you complete it the first time, and builds understanding of the data.
Handout #1: step-by-step guide and template (older, with some N/A items)
Handout #2: what other schools do with it; a sample Waterford District Profile, a step above the AER, used as communication documentation
Handout #3: for JCISD, a sample Excel file pre-loaded with county/state charts and graphs, but without subgroups, for a one-, two-, or three-building district
Everything is on the flash drive, including the prior SPR 40 and DPR 40 items, step-by-step guides, and templates.

69 Students Near Proficiency
Before or after BAA, you might want to look at the MiSchoolData D4SS reports. Who are the bubble students? How did a cohort do? For Students Near Proficiency, choose a cut score of 5.

70 Students Near Proficiency
Bubble students (with login only): click on the graph, and the student list with names and PIC numbers will appear.

71 Students Near Proficiency
Ance was the only student who scored at the highest level. This report then gives you the breakdown by content area. Imagine what teachers might do with this information.

72 Cohort Proficiency
MEAP: kids who moved in one year from a low 3 to a high 3. Hover over the circles and click to see scale scores and student names. The arrow points to level 4s who stayed level 4s; reading to the right: 4s that went to 3s, then 4s that went to 2s; the next row would be 4s that went to 1s, a big jump. What could you do? Have next-level teachers look at student bubble scores and those who moved, and look for the teachers of the kids that moved.

73 Monitoring Implementation (slide repeated from 64)

74 Questions All of us: Field questions / updates about ASSIST

75 Lunch 11:30-12:00

76 Facilitated Work Time 12:30-3:30

77 Evaluation and Feedback

78 Questions/Comments? Please contact:
Susan Townsend Maeghan McCormick Sandy White Steve Doerr Jennifer Fox Or visit the MDE - School Improvement website

