RE-THINKING HOW SCHOOLS IMPROVE

Presentation transcript:

RE-THINKING HOW SCHOOLS IMPROVE: A Brief Introduction to the Classroom-Focused Improvement Process (CFIP)

“Every organization is perfectly designed to get the results it achieves.” --W. Edwards Deming
Is your school “designed” to the specifications of the old or the new paradigm?

Data- and knowledge-driven schools and school districts use data for two major but different purposes:
- Accountability (to prove)
- Instructional decision making (to improve)

Data answer different questions.
For accountability (data to prove), SUMMATIVE:
- “How many students passed?”
- “Who passed and who didn’t?”
For improvement (data to improve), FORMATIVE:
- “What do the students know?”
- “What do the students not know, and what are we going to do about it?”

Sources of Student Achievement Data
- External assessment data
- Benchmark or course-wide assessment data
- Individual teacher assessment data
--Supovitz and Klein (2003)

The Hierarchy of Data for Accountability Purposes
1. External (State & National) Assessments
2. System Benchmark Assessments
3. Common School or Course Assessments
4. Classroom Assessments of Student Work

The Hierarchy of Data for Instructional Decision Making
1. Classroom Assessments of Student Work
2. Common School or Course Assessments
3. System Benchmark Assessments
4. External (State & National) Assessments

What does it take to “improve” a school? “School improvement is most surely and thoroughly achieved when teachers engage in frequent, continuous, and increasingly concrete and precise talk about teaching practice . . . adequate to the complexities of teaching, [and] capable of distinguishing one practice and its virtue from another.” --Judith Warren Little, “Teachers as Colleagues,” in V. Richardson-Koehler (Ed.), Educators’ Handbook (White Plains, NY: Longman, 1987).

Critical Teacher Behaviors in a Strong Professional Learning Community
- Reflective dialogue
- De-privatization of practice
- Collective focus on student learning
- Collaboration
- Shared norms and values
--Kruse, Louis, and Bryk, 1994

It is becoming increasingly clear that schools improve because student performance improves; and student performance improves because teachers at the classroom level:
- Carefully assess student learning
- Examine the results of their assessments
- Implement needed enrichments and interventions for students
- Consider the implications of assessment results for their future teaching
- Adjust their practice accordingly

The Classroom-Focused Improvement Process:
- Uses real-time, current data
- Is specific to each course or grade level
- Incorporates collaborative teaching teams and individual teachers
- Addresses individual students’ needs
- Brings together data from several assessment sources
- Results in instructional improvements that can be integrated into daily lesson plans (“job embedded”)
(continued on next slide)

The Classroom-Focused Improvement Process (cont.):
- Provides for in-class enrichments and interventions that can be re-directed frequently if they are not working
- Helps teachers perceive the data analysis process as a worthwhile use of their time
- Values the input of teachers as the most important instructional decision makers

The Six Easy CFIP Steps
1. Be sure everyone understands the data being analyzed.
2. Pose a question or two that the data can answer.
3. Look for class-wide patterns in the data.
4. Act on the class patterns, including re-teaching, if needed.
5. Address individual students’ needs for enrichment and intervention that remain after re-teaching.
6. Decide on and implement at least one way that instruction will be improved in the next unit.

CFIP Step 1: Understand the data source. Build ASSESSMENT LITERACY with questions like these:
- What assessment is being described in this data report?
- What were the characteristics (the “quirks”) of the assessment?
- Who participated in the assessment? Who did not? Why?
- Why was the assessment given? When?
- What do the terms in the data report mean?

CFIP Step 2: Identify the questions that can be answered by the data. All data analyses should be designed to answer a question. Unless there is an important question to answer, there is no need for a data analysis.

CFIP Step 3: Look for class-wide patterns in a single data source.
- What do you see over and over again in the data?
- What are the strengths of the class? What knowledge and skills do the students have?
- What are the weaknesses of the class? What knowledge and skills do the students lack?
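To make “looking for class-wide patterns” concrete, here is a minimal sketch (not part of the CFIP materials themselves) that tallies the percent correct on each item of a common assessment and flags the items the class as a whole struggled with. The item names, scores, and 70% cutoff are invented for illustration.

```python
# Minimal sketch: find class-wide patterns in item-level results from one
# common assessment. All data and the 70% cutoff are hypothetical.
from statistics import mean

# 1 = correct, 0 = incorrect; each list holds one score per student
results = {
    "main_idea":     [1, 1, 0, 1, 1, 0, 1, 1],
    "inference":     [0, 1, 0, 0, 1, 0, 0, 1],
    "vocab_context": [1, 1, 1, 1, 0, 1, 1, 1],
}

CLASS_CUTOFF = 0.70  # assumed threshold for calling an item a class-wide weakness

for item, scores in results.items():
    pct = mean(scores)
    label = "class-wide weakness" if pct < CLASS_CUTOFF else "relative strength"
    print(f"{item}: {pct:.0%} correct -> {label}")
```

A team could produce the same tally with a spreadsheet pivot table; the point at this step is simply to surface items, not individual students.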

CFIP Step 4: Act on the class-wide patterns.
- What instructional factors might have contributed to the class-wide patterns?
- What will we do to address patterns of class needs?
- How and when will we reassess to determine student progress and the effectiveness of our instruction?

Six Easy CFIP Steps
CFIP is an ongoing, circular (not linear) process. The first CFIP dialogue might only get this far:
1. Make sure everyone understands the data being considered.
2. Identify a question or two that the data can answer.
3. Look for class-wide patterns in the data.
4. Decide what to do about the class patterns.
In many situations, the CFIP dialogue will then be put on hold until any needed re-teaching occurs.

CFIP Step 5: Drill down to individual students. Identify needed differentiations.
- Which students need enrichments and interventions?
- On what should enrichments and interventions focus?
- How will we deliver interventions so that students do not lose future direct instruction?
- How will we assess the effectiveness of the interventions and enrichments?

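As one concrete illustration of this drill-down, the sketch below sorts students into intervention, on-track, and enrichment groups based on their overall scores on a common assessment. The student names, scores, and the 60/90 cutoffs are hypothetical; in practice the team sets the criteria and uses item-level evidence, not just a total score.

```python
# Minimal sketch: group students for intervention or enrichment from overall
# scores on a common assessment. Names, scores, and cutoffs are hypothetical.
scores = {"Ana": 55, "Ben": 72, "Chloe": 93, "Dev": 48, "Eli": 88, "Faith": 95}

INTERVENTION_BELOW = 60      # assumed cutoff for re-teaching / intervention
ENRICHMENT_AT_OR_ABOVE = 90  # assumed cutoff for enrichment

groups = {"intervention": [], "on track": [], "enrichment": []}
for student, score in scores.items():
    if score < INTERVENTION_BELOW:
        groups["intervention"].append(student)
    elif score >= ENRICHMENT_AT_OR_ABOVE:
        groups["enrichment"].append(student)
    else:
        groups["on track"].append(student)

for group, students in groups.items():
    print(f"{group}: {', '.join(students) or 'none'}")
```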

CFIP Step 6: Reflect on the reasons for student performance. Identify and implement instructional changes in the next unit.
- How will we change instruction in our next unit? Content focus . . . Pacing . . . Teaching methods . . . Assignments . . .

CFIP Step 6 (cont.): Determine how we will measure the effects of our new instructional strategy.
- How will we measure the success of our new instructional strategy?
- When will we review the data again to determine the success of the enrichments and interventions?
- What do the data not tell us? What questions about student achievement do we still need to answer? How will we attempt to answer these questions?
- How well did the CFIP session go? How could we make our next meeting more effective?
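One simple way to measure the effect of a new strategy or intervention is to compare each student’s score before and after the re-teaching on the follow-up common assessment. The sketch below does this with invented numbers; the students and scores are hypothetical.

```python
# Minimal sketch: compare pre- and post-intervention scores to gauge the
# effect of re-teaching. All names and numbers are hypothetical.
pre  = {"Ana": 55, "Ben": 72, "Dev": 48}
post = {"Ana": 70, "Ben": 80, "Dev": 66}

gains = {name: post[name] - pre[name] for name in pre}
average_gain = sum(gains.values()) / len(gains)

for name, gain in gains.items():
    print(f"{name}: {pre[name]} -> {post[name]} ({gain:+d} points)")
print(f"Average gain after re-teaching: {average_gain:+.1f} points")
```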

Strive for “deep implementation” of the strategies and interventions your data analysis leads you to.

What Does “Deep Implementation” of Data Analysis Look Like?
- There is a limit on the number of initiatives a school undertakes at a time. (Reeves says six is the maximum; I think six is too many.)
- 90% of the appropriate faculty is participating.
- There is widespread understanding of the reasons for the data analysis process, even if there is not complete “buy-in.”
- The data analysis process is discussed regularly at staff meetings.

What Does “Deep Implementation” of Data Analysis Look Like? (cont.)
- The data analysis process is written into the school plan.
- Ongoing coaching is provided.
- “Adult” data are collected on a regular basis to monitor implementation of the data analysis process.
- There is continuous reflection on and improvement of the data analysis process.

Pre-conditions for successful implementation of the Classroom-Focused Improvement Process (CFIP)
- A cohesive and collaborative team that shares common subject matter and common assessments
- Common planning time for the team of at least one hour twice weekly, of which one hour is devoted exclusively to CFIP
- A principal who is a strong instructional leader and is comfortable with the concept of shared leadership
- Norms to guide the team’s process of collaborative data analysis
- Autonomy for the team to adjust teaching practices and interventions based on data from assessments of their students’ learning
- Use of short-cycle, common assessments by the team
- Ongoing professional development to enhance the team’s capacity to continually adjust teaching practice in response to student data

Caveats about CFIP
- It is a paradigm shift from the traditional lesson-planning format.
- It is not easy, especially at first. Teams should follow the steps faithfully until they become second nature.
- Teams should expect mistakes and imprecision in the data.
- The results are worth the effort. “CFIP transforms a school.” - Mike Markoe, Washington County Assistant Superintendent for Elementary Education
- CFIP is NOT presented as a silver bullet.
These caveats, or warnings (from the Latin “caveat emptor,” “let the buyer beware”), are offered up front so that teams enter the process with a full understanding that it is the capacity of the school staff, their full engagement in the process, and their willingness (and ability) to act on the results that determine whether this model will be a success or a failure for them.
This is the final slide in Part 1.

THANK YOU for your engagement in this “refresher” on the CFIP process!