Data-Driven Instruction Comprehensive Leadership Workshop


Data-Driven Instruction Comprehensive Leadership Workshop Paul Bambrick-Santoyo

[Scatter plot: NY State Public School 4th Grade ELA Performance vs. Free-Reduced Lunch Rates; y-axis: Pct. Proficient (10%-100%), x-axis: Pct. Free-Reduced Lunch (10%-100%)]


Case Study: Springsteen Charter School, Part 1 What did Jones do well in his attempt to improve mathematics achievement? What went wrong in his attempt to do data-driven decision making? As the principal at Springsteen, what would be your FIRST STEPS in the upcoming year to respond to this situation?

Man on Fire: What were the key moments in Creasy’s attempt to help the girl (Pita)? What made Creasy’s analysis effective?

ASSESSMENT ANALYSIS I PART 1—GLOBAL IMPRESSIONS: Global conclusions you can draw from the data: How well did the class do as a whole? What are the strengths and weaknesses in the standards: where do we need to work the most? How did the class do on old vs. new standards? Are they forgetting or improving on old material? How were the results in the different question types (multiple choice vs. open-ended, reading vs. writing)? Who are the strong/weak students?

ASSESSMENT ANALYSIS II PART 2—DIG IN: “Squint:” bombed questions—did students all choose same wrong answer? Why or why not? Compare similar standards: Do results in one influence the other? Break down each standard: Did they do similarly on every question or were some questions harder? Why? Sort data by students’ scores: Are there questions that separate proficient / non-proficient students? Look horizontally by student: Are there any anomalies occurring with certain students?
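The "sort data by students' scores" step above can be sketched in a few lines of code. This is a minimal illustration, not part of the workshop materials: the student names, item results, and the 0.75 gap threshold are all invented for the example.

```python
# Hypothetical item-level results: 1 = correct, 0 = incorrect.
# All names and scores below are invented for illustration only.
results = {
    "Ana":   [1, 1, 1, 1],
    "Ben":   [1, 1, 1, 0],
    "Carla": [1, 0, 1, 0],
    "Deon":  [1, 0, 0, 0],
}

# Sort students by total score, highest first.
ranked = sorted(results, key=lambda s: sum(results[s]), reverse=True)
half = len(ranked) // 2
top, bottom = ranked[:half], ranked[half:]

def accuracy(group, q):
    """Fraction of the group answering question q correctly."""
    return sum(results[s][q] for s in group) / len(group)

# Questions where the top half strongly outperforms the bottom half
# are the ones that separate proficient from non-proficient students.
num_questions = len(next(iter(results.values())))
gaps = {q + 1: accuracy(top, q) - accuracy(bottom, q) for q in range(num_questions)}
separators = [q for q, gap in gaps.items() if gap >= 0.75]
print(separators)  # question 1 was answered by everyone; only question 2 separates
```

With this toy data the script prints `[2]`: every student got question 1 right (it tells you nothing), while question 2 was answered correctly only by the top half.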

Teacher-Principal Role Play ROLE-PLAY ANALYSIS: What did you learn about the teachers? How did the interim assessment and analysis template change the dynamic of a normal teacher/principal conversation? By using this particular assessment and analysis template, what decisions did the principal make about what was important for the student learning at his/her school?

Teacher-Principal Role Play META-ANALYSIS: What are the strengths and limitations of this approach to data-driven decision making? What structures are needed to allow such a process to happen?

Videos of Teacher-Principal Conferences (Videotaped 2005-06)

Impact of Data-Driven Decision Making North Star Academy State Test & TerraNova Results 2003-2008

Comparison of 02-03 to 03-04: How one teacher improved

Comparison of 02-03 to 03-04: How a 2nd teacher improved

TERRANOVA (Percentage at or above grade level)

6th Grade 2002-2003 (N=43 students):
             2002 (6th Grade Pre-Test)   2003 (6th grade)   CHANGE
Reading      53.7%                       29.3%              - 24.4
Language     51.2%                       48.8%              - 2.4

6th Grade 2003-2004 (N=42 students):
             2003 (5th grade)            2004 (6th grade)   CHANGE
Reading      40.5%                       44.2%              + 3.7
Language                                 79.1%              + 38.6

North Star Academy: NJ State Test Results 2009

NJASK 8—DOWNTOWN MS LITERACY

NJASK 8—DOWNTOWN MS MATH

North Star Middle Schools: Setting the Standard

North Star Elementary: Exploding Expectations

HIGH SCHOOL HSPA—ENGLISH Comparative Data from 2008 HSPA Exam

HIGH SCHOOL HSPA—MATH Comparative Data from 2008 HSPA Exam

NEW JERSEY HSPA—ENGLISH PROFICIENCY

NEW JERSEY HSPA—MATH PROFICIENCY

Data-Driven Instruction & Assessment: Day 1 Conclusions (Paul Bambrick-Santoyo)

Data-Driven Instruction & Assessment: Day 2 (Paul Bambrick-Santoyo)

Dodge Academy: Turnaround Through Transparency

Ft. Worthington: Turnaround Through Transparency

Monarch Academy: Vision and Practice

Quick-Write Reflection From what you know right now, what are the most important things you would need to launch a data-driven instructional model in your school?

THE FOUR KEYS: DATA-DRIVEN INSTRUCTION AT ITS ESSENCE: ASSESSMENTS ANALYSIS ACTION in a Data-driven CULTURE

1. 50% of 20:
2. 67% of 81:
3. Shawn got 7 correct answers out of 10 possible answers on his science test. What percent of questions did he get correct?
4. J.J. Redick was on pace to set an NCAA record in career free throw percentage. Leading into the NCAA tournament in 2004, he made 97 of 104 free throw attempts. What percentage of free throws did he make?
5. J.J. Redick was on pace to set an NCAA record in career free throw percentage. Leading into the NCAA tournament in 2004, he made 97 of 104 free throw attempts. In the first tournament game, Redick missed his first five free throws. How far did his percentage drop from before the tournament game to right after missing those free throws?
6. J.J. Redick and Chris Paul were competing for the best free-throw shooting percentage. Redick made 94% of his first 103 shots, while Paul made 47 out of 51 shots. Which one had a better shooting percentage? In the next game, Redick made only 2 of 10 shots while Paul made 7 of 10 shots. What are their new overall shooting percentages? Who is the better shooter? Jason argued that if Paul and J.J. each made the next ten shots, their shooting percentages would go up the same amount. Is this true? Why or why not?
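Questions 4 and 5 reduce to one percentage formula applied before and after the five misses. A quick sketch of the arithmetic (not part of the workshop handout), using only the figures given in the problems:

```python
def free_throw_pct(made, attempted):
    """Free-throw percentage as a value out of 100."""
    return 100 * made / attempted

# Question 4: 97 made out of 104 attempts before the tournament.
before = free_throw_pct(97, 104)      # about 93.3%

# Question 5: five straight misses add 5 attempts but no makes.
after = free_throw_pct(97, 104 + 5)   # about 89.0%

drop = before - after                 # about 4.3 percentage points
print(round(before, 1), round(after, 1), round(drop, 1))
```

So the answer to question 5 is that Redick's percentage falls roughly 4.3 points, from about 93.3% to about 89.0%, which is exactly the kind of "same standard, harder application" contrast the slide is illustrating.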

ASSESSMENT BIG IDEAS: Standards (and objectives) are meaningless until you define how to assess them. Because of this, assessments are the starting point for instruction, not the end.

ASSESSMENTS: LITTLE RED RIDING HOOD:
1. What is the main idea?
2. This story is mostly about:
A. Two boys fighting
B. A girl playing in the woods
C. Little Red Riding Hood’s adventures with a wolf
D. A wolf in the forest
3. This story is mostly about:
A. Little Red Riding Hood’s journey through the woods
B. The pain of losing your grandmother
C. Everything is not always what it seems
D. Fear of wolves

ASSESSMENTS: Subject-Verb Agreement   He _____________ (run) to the store. Michael _____________ (be) happy yesterday at the party. Find the subject-verb agreement mistake in this sentence: Find the grammar mistake in this sentence: Find the six grammar and/or punctuation mistakes in this paragraph:

ASSESSMENT BIG IDEAS: In an open-ended question, the rubric defines the rigor. In a multiple choice question, the options define the rigor.

ASSESSMENTS: 1. Solve the following quadratic equation: 2. Given the following rectangle with the lengths shown below, find the value of x: Area = 6

ASSESSMENTS: PRINCIPLES FOR EFFECTIVE ASSESSMENTS: COMMON INTERIM: At least quarterly Common across all teachers of the same grade level DEFINE THE STANDARDS—ALIGNED TO: To state test (format, content, & length) To instructional sequence (curriculum) To college-ready expectations

ASSESSMENTS: PRINCIPLES FOR EFFECTIVE ASSESSMENTS: REASSESS: Standards that appear on the first interim assessment appear again on subsequent interim assessments WRONG ANSWERS: Illuminate misunderstanding TRANSPARENT: Teachers see the assessments in advance

THE FOUR KEYS: DATA-DRIVEN INSTRUCTION AT ITS ESSENCE: ASSESSMENTS (Interim, Aligned, Reassess, Transparent) ANALYSIS ACTION in a Data-driven CULTURE


ASSESSMENTS: Reading Decisions LEVELED VS. SKILLS: Will your interim assessment develop around reading levels or reading skills?

Leveled Assessment Debate

Grade-Level Assessments
PROS:
- Predict results on external assessments
- Measure student achievement against grade-level standard
- Ensure school maintains high standards and the expectation that all students will reach grade level
CONS:
- If a student is significantly behind in level, offers little information to inform instruction
- Difficult to see incremental (monthly or quarterly) reading gains
- Because text is often inaccessible to students, little data can be gathered on strengths and weaknesses by standard
- Demoralizing for students to constantly fail

Leveled Reading Assessments
PROS:
- Shows growth along the leveled-text continuum—possible to see monthly gains toward grade-level standard
- Because the text is at an accessible level, gives data on individual reading standards
- Motivates students and engenders student ownership of learning process
- Confirms student reading levels for teachers
- Assessment levels correspond to book levels
CONS:
- Does not predict results on external assessments
- If not supplemented by grade-level assessments, could lower standards and expectations for the school

ASSESSMENTS: Writing RUBRIC: Take a good one, tweak it, and stick with it ANCHOR PAPERS: Write/acquire model papers for Proficient and Advanced Proficient that will be published throughout the school & used by teachers GRADING CONSENSUS: Grade MANY student papers together to build consensus around expectations with the rubric DRAFT WRITING VS. ONE-TIME DEAL: Have a balance

ASSESSMENTS: High School HIGH SCHOOL PROFICIENCY VS. COLLEGE READINESS: Preparing for HS state test and ACT/SAT/AP/college-level work SOLID SHORT PAPERS VS. RESEARCH PAPER MATH: Textbook vs. Application vs. Conceptual understanding

ASSESSMENT ANALYSIS: Exercise TASK: Compare State assessment with interim assessment USE ASSESSMENT ANALYSIS SHEET TO ANSWER: Are they aligned in CONTENT? What is the interim assessment missing? Are they aligned in FORMAT/LENGTH? Do they reflect COLLEGE-READY expectations?

Case Study: Douglass Street School Did Krista Brown meet the challenge of 15-point gains? What percentage of teachers do you think made the gains? Which teachers did not? Why? Based on your answers, name the biggest stumbling blocks to the school’s success. Based on your answers, name the most important drivers of school improvement.

TRADITIONAL SYSTEMS: Principal-centered HOW TO EVALUATE TEACHER EFFECTIVENESS: How dynamic the lesson appeared to be How well you control the kids How good the curriculum guide / scope & sequence are (“well-intended fiction”—Jacobs) “What is the real curriculum?” “The textbook.” What the teacher teaches and how “good” their pedagogical choice was

DATA-DRIVEN CULTURE: VISION: Established by leaders and repeated relentlessly TRAINED LEADERSHIP TEAM: “real” leaders and formal leaders involved in process CALENDAR: Calendar in advance with built-in time for assessments, analysis & action PROFESSIONAL DEVELOPMENT: Aligned

THE FOUR KEYS: ASSESSMENTS (Aligned, Interim, Reassess, Transparent) ANALYSIS ACTION in a Data-driven CULTURE (Vision, Leadership, Calendar, PD)

Analysis, Revisited: Moving from the “What” to the “Why”

Man on Fire: What made Creasy’s analysis effective? After a solid analysis, what made Creasy’s action plan effective?

ANALYSIS: IMMEDIATE: Ideal 48 hrs, max 1 wk turnaround BOTTOM LINE: Includes analysis at question level, standards level and overall—how well did the students do as a whole TEST-IN-HAND analysis: Teacher & instructional leader together TEACHER-OWNED analysis DEEP: Moves beyond “what” to “why”

THE FOUR KEYS: ASSESSMENTS (Aligned, Interim, Reassess, Transparent) ANALYSIS (Quick, Bottom line, Teacher-owned, Test-in-hand, Deep) ACTION in a Data-driven CULTURE (Vision, Leadership, Calendar, PD)

Running Effective Analysis Meetings

PRECURSORS TO EFFECTIVE ANALYSIS MEETINGS Did teachers see the assessment in advance? (TRANSPARENCY) Did they mark it up: Confident, Not Sure, No Way? (TEST-IN-HAND, TEACHER-OWNED) Did you train teachers in analysis strategies? (PROF DEVT, DEEP) Did they fill out an analysis sheet? Did they answer the fundamental question: WHY did the students not learn it? (TEACHER-OWNED, DEEP) Did they have to fill out an action plan? Did you model how to fill out an action plan using these analysis questions? (ACTION PLAN, ACCOUNTABILITY)

PRECURSORS TO EFFECTIVE ANALYSIS MEETINGS, CONT. Did you model a poor and a good conversation so they hear your expectations? (PROF DEVT, DEEP) Did you analyze their results (above and beyond them analyzing their own) in preparation for the meeting? (LEADERSHIP) Did you collect their analysis ahead of time and see if it looked acceptable? (LEADERSHIP, ACCOUNTABILITY) Did you have a plan ready to access content experts if the problems were beyond your expertise? (PROF DEVT)

TIPS FOR EFFECTIVE ANALYSIS MEETINGS: Let the data do the talking Let the teacher do the talking (or get them to!) Always go back to the test and back to specific questions Don’t fight the battles on ideological lines (you’re going to lose) There’s a difference between the first assessment and the third You’ve got to know the data yourself to have an effective meeting Make sure it’s connected to a concrete plan that you can verify

HELPFUL STARTERS FOR ANALYSIS MEETINGS: “So…what’s the data telling you?” “Congratulations on the improvement from last time in x area! You must be really proud of their growth here.” “So the _____ [paraphrase their frustration: the test was hard, the students were difficult, etc.]? I’m sorry to hear that. So where should we begin with our action plan moving forward?”

DATA-FOCUSING PHRASES FOR ANALYSIS MEETINGS: “So let’s look at question 18… Why do you think they got it wrong?” “You know, I thought it might be a silly mistake, but what surprised me is that they did really well on questions x & y. Why do you think they did so well on these questions and yet not on your original question?” “Let’s look at question 11. What did the students need to be able to do to answer that question effectively? Is this more than they are able to do with you in your class?” [When new ideas occur or deeper analysis is done at the meeting than what the teacher did previously] “So let’s re-visit the action plan you created and see how we can incorporate these additional ideas.”

The Language of Leadership: “That’s nice, but tell me again: what’s the point of all this?”

Data-Driven Instruction & Assessment: Day 1 Conclusions (Paul Bambrick-Santoyo)

DATA-DRIVEN RESULTS: Greater Newark Academy Charter School
8th Grade GEPA Results (% Proficient / Adv Proficient)

Year Tested            Language Arts   Mathematics
GNA 2004               46.3            7.3
GNA 2005               63.2            26.3
GNA 2006               73.5
GNA 2007               80.1            81.8
Difference 2004-07     + 33.8          + 74.5
Newark Schools 2006    54.5            41.5
NJ Statewide 2006      82.5            71.3
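The "Difference 2004-07" row follows directly from the 2004 and 2007 figures in the GEPA results; a one-line arithmetic check (an illustration only, with the proficiency percentages copied from the slide):

```python
# GEPA % proficient/advanced proficient, from the slide above.
la_2004, la_2007 = 46.3, 80.1      # Language Arts
math_2004, math_2007 = 7.3, 81.8   # Mathematics

# Point gains from 2004 to 2007, matching the "Difference 2004-07" row.
la_gain = round(la_2007 - la_2004, 1)        # 33.8
math_gain = round(math_2007 - math_2004, 1)  # 74.5
print(la_gain, math_gain)
```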

Greater Newark Charter: Achievement by Alignment

Morell Park Elementary School: Triumph in Planning

Holabird Academy: Coaching to Achievement

Chicago International Charter: Winning Converts

Excellence Charter School—3rd Grade Math *District, NYC, and state results are for 2006.

E.L. Haynes Charter School: Scheduled to Succeed

Capitol Heights Elementary: Data in the Blue Book

The Language of Leadership: “That’s nice, but tell me again: what’s the point of all this?”

One-Minute Responses—Agenda: Small Groups/Pairs—Deliver Speeches (10 min): Deliver responses to each other Give feedback: What message did you hear (verbally and non-verbally)? What had the biggest impact? What would you change/improve? Write down responses that work

Mr. Holland’s Opus: What made the difference? How did Lou Russ finally learn to play the drum? What changed Mr. Holland’s attitude and actions?

ACTION: PLAN new lessons based on data analysis ACTION PLAN: Implement what you plan (dates, times, standards & specific strategies) LESSON PLANS: Observe changes in lesson plans ACCOUNTABILITY: Observe changes in classroom observations and in-class assessments ENGAGED STUDENTS: Know end goal, how they did, and what actions they’re taking to improve

THE FOUR KEYS: ASSESSMENTS (Aligned, Interim, Reassess, Transparent) ANALYSIS (Quick, Bottom line, Teacher-owned, Test-in-hand, Deep) ACTION (Action Plan, Accountability, Engaged) in a Data-driven CULTURE (Vision, Leadership, Calendar, PD)

Results Meeting Protocol Effective Group Meeting Strategy

ACTION: RESULTS MEETING (50 MIN TOTAL) IDENTIFY ROLES: Timer, facilitator, recorder (2 min) IDENTIFY OBJECTIVE to focus on (2 min or given) WHAT WORKED SO FAR (5 min) [Or: What teaching strategies have you tried so far?] CHIEF CHALLENGES (5 min) BRAINSTORM proposed solutions (10 min) [See protocol on next page] REFLECTION: Feasibility of each idea (5 min) CONSENSUS around best actions (15 min) PUT IN CALENDAR: When will the tasks happen? When will the teaching happen? (10 min)

RESULTS MEETING STRUCTURE: PROTOCOLS FOR BRAINSTORMING/CONSENSUS PROTOCOL FOR BRAINSTORMING: Go in order around the circle: each person has 30 seconds to share a proposal. If you don’t have an idea, say “Pass.” No judgments should be made; if you like the idea, when it’s your turn simply say, “I would like to add to that idea by…” Even if 4-5 people pass in a row, keep going for the full brainstorming time. PROTOCOL FOR REFLECTION: 1 minute—Silent personal/individual reflection on the list: what is doable and what isn’t for each person. Go in order around the circle once: depending on size of group each person has 30-60 seconds to share their reflections. If a person doesn’t have a thought to share, say “Pass” and come back to that person later. No judgments should be made.

RESULTS MEETING STRUCTURE: PROTOCOL FOR CONSENSUS/ACTION PLAN: ID key actions from brainstorming that everyone will agree to implement Make actions as specific as possible within the limited time ID key student/teacher guides or tasks that must be completed to be ready to teach—ID who will do each task Spend remaining time developing concrete elements of the lesson plan: Do Nows Teacher guides (e.g., what questions to ask the students or how to structure the activity) Student guides HW, etc. NOTE: At least one person (if not two) should be recording everything electronically to send to the whole group

TOPIC CHOICES FOR RESULTS MEETING: FIRST PD SESSION WITH ENTIRE FACULTY: Design the agenda for the whole-staff meeting introducing the data-driven instructional model you will launch Assume that the school has done very little in this area, and the teachers associate “data-driven” instruction with state testing and test prep FIRST TEAM MEETING: Design the agenda for the first meeting with the grade-level team that you will lead during your residency Assume that the team has done very little in this area, and the teachers associate “data-driven” instruction with state testing and test prep COLLEGE READINESS: For high school administrators, design the steps you will take to adapt your city or state assessments to prepare students to succeed at the college level ADAPT CITY EXAMS: Finally, if your city has mandatory exams, design the steps you will take to bring these exams into alignment with your end goal tests.

ACTION: RESULTS MEETING (50 MIN TOTAL) IDENTIFY ROLES: Timer, facilitator, recorder (2 min) IDENTIFY OBJECTIVE to focus on (2 min or given) WHAT WORKED SO FAR (5 min) [Or: What teaching strategies have you tried so far?] CHIEF CHALLENGES (5 min) BRAINSTORM proposed solutions (10 min) [See protocol on next page] REFLECTION: Feasibility of each idea (5 min) CONSENSUS around best actions (15 min) PUT IN CALENDAR: When will the tasks happen? When will the teaching happen? (10 min)

Dealing with Challenging Situations Start Up Scenarios Dealing with Challenging Situations

ENTRY SCENARIOS: DEALING WITH CHALLENGING SITUATIONS YOUR TEACHER TEAM DOES NOT HAVE ANY INTERIM ASSESSMENTS: You are placed with a teacher team at a grade level for which there are no citywide interim assessments, and the school doesn’t have any either. What do you do? YOUR DISTRICT HAS POOR MANDATED INTERIM ASSESSMENTS: Your district has an interim assessment in November and April, and your state test is in June. Not only are the interim assessments too far apart, as you review them you realize that they only cover about half of the standards that will be on the state assessment, and they don’t include any open-ended responses. What do you do?

ENTRY SCENARIOS: DEALING WITH CHALLENGING SITUATIONS JADED LEAD TEACHER: You are working with a team of teachers and do your opening PD with them around data-driven instruction, and the younger teachers seem very interested in working on this. But the oldest teacher on the team (who has a very important influence on everyone else) makes very dismissive comments about how this is a waste of time. You give your one-minute response about the importance of this work, but you can see that the newer teachers’ enthusiasm drops. What do you do?

ENTRY SCENARIOS: DEALING WITH CHALLENGING SITUATIONS ASSESSMENTS ARE UNALIGNED WITH INSTRUCTIONAL SEQUENCE: You notice that the interim assessments in your city are not aligned with the instructional sequence that the teachers are mandated to follow. What do you do? NO GOOD ANALYSIS: Your district takes too long to produce a data report, and you have no analysis templates to use. What do you do?

Data-Driven Instruction & Assessment: Burning Questions (Paul Bambrick-Santoyo)

Data-Driven Instruction & Assessment: Conclusions (Paul Bambrick-Santoyo)