Successful Literacy Coaching: Using Data to Enhance Literacy Instruction
IRA, May 6, 2008

Objectives
– Discuss barriers to developing a culture of data-driven decision making
– Identify the purpose of different categories of assessments
– Investigate a collaborative process for data-driven decision making

Agenda
I. The Numbers Game
II. What is a Culture of Data Use?
III. The Importance of Collaboration
IV. What Data Should We Use?
V. Collaborative Data Analysis

Where Do We Start?
What are we supposed to do with all this stuff?

What is a Culture of Data Use?

Data-Driven Decision Making: an integrated, collaborative, and iterative process
– Identify Issues: What do we need to know more about?
– Collect Data: What data do we need? Go get it!
– Analyze Data: Look at it!
– Develop Goals: Why is this happening? What do we want to do about it?
– Design Action Plan: How are we going to do it?
– Implement Action Plan: Go do it!
– Monitor/Assess/Revise: How are we doing?
At appropriate times: Communicate Results. Whom should we inform and how?
Participant Guide, p. 3

What makes it so difficult to build this culture?

Cognitive Dissonance
Cognitive dissonance theory tells us that, to reduce stress, human beings strive for congruence between their behavior and their beliefs. Psychologists call the tension a person feels when actions are not consistent with beliefs "cognitive dissonance." Do your existing school processes leverage this inherent desire for cognitive congruence as a force for positive change?
Participant Guide, p. 4

Barriers to Effective Data Use
– Cultural Barriers
– Technical Barriers
– Political Barriers
Welcome to Data-Driven Decision Making!
Participant Guide, p. 5

Barriers to Effective Data Use
Cultural Barriers
– Personal metrics for judging teaching differ from the metrics of external parties
– Decisions are based on experience and intuition rather than on systematically collected information
– Disagreement about which student outcomes and data are important
– Teachers may disassociate their own performance from student performance
(Ingram, Seashore Louis, & Schroeder, 2004)

Barriers to Effective Data Use
Technical Barriers
– The data teachers want are rarely available and are difficult to measure
– Inadequate time to collect and analyze data
Political Barriers
– Data have often been used politically, leading to mistrust of data and data avoidance
(Ingram, Seashore Louis, & Schroeder, 2004)

Tips on Using Data Safely
– Do not use data primarily to identify or eliminate poor teachers.
– Inundate practitioners with success stories that include data.
– Collect and analyze data collaboratively and anonymously by team, department, grade level, or school (see the sketch below).
– Allow teachers, by school or team, as much autonomy as possible in selecting the kind of data they think will be most helpful.
(Schmoker, 1999)
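As a minimal sketch of the team-level, anonymous analysis suggested above (assuming a hypothetical scores.csv with grade, teacher, and score columns; the file and column names are illustrative and not part of the original presentation), a coach could aggregate results to grade-level averages so that individual teachers are never reported:

# A minimal sketch, not part of the original presentation. Assumes a
# hypothetical scores.csv with "grade", "teacher", and "score" columns.
# Results are aggregated to grade-level averages; individual teachers
# are never reported, keeping the analysis anonymous at the team level.
import csv
from collections import defaultdict

def grade_level_averages(path="scores.csv"):
    totals = defaultdict(lambda: [0.0, 0])  # grade -> [sum of scores, count]
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            totals[row["grade"]][0] += float(row["score"])
            totals[row["grade"]][1] += 1  # the teacher column is ignored
    return {grade: total / count for grade, (total, count) in totals.items()}

if __name__ == "__main__":
    for grade, average in sorted(grade_level_averages().items()):
        print(f"Grade {grade}: average score {average:.1f}")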

The Foundation of the Culture
The calibration process allows stakeholders to explore personal positions on important questions, such as what students should learn and how we will know learning has happened, with the stated aim of arriving at a group (e.g., team, school, or district) set of common standards and definitions.
(Stringfield & Wayman, 2006)

What Data Should We Use?

Data Domains (Bernhardt, 2003)
– Demographic: information about who we are
– Perceptual: information about how we think
– Student Learning: outcomes
– School Processes: information about how we work
Participant Guide, p. 6

Four Categories of Reading Assessments
– Outcome: Provide a bottom-line evaluation of the effectiveness of the reading program
– Screening: Determine which students are at risk for reading difficulty and will need additional intervention
– Diagnostic: Help teachers plan instruction by providing in-depth information about students' skills and instructional needs
– Progress Monitoring: Determine whether students are making adequate progress or need more intervention to achieve grade-level reading outcomes

Four Categories of Reading Assessments: Examples
– Outcome: AIMS (Arizona Instrument to Measure Standards), TerraNova, District Benchmarks, DIBELS (Dynamic Indicators of Basic Early Literacy Skills)
– Screening: DIBELS, KIST (Kindergarten Individual Screening Test), Reading Series Phonics Screening
– Diagnostic: DRA2 (Developmental Reading Assessment), Words Their Way Qualitative Spelling Inventory, QRI (Qualitative Reading Inventory), teacher-created assessment
– Progress Monitoring: DIBELS, DRA2, district-created prescriptive assessments

Reading Assessment
Categorize the assessments used in your school:
1. Outcome
2. Screening
3. Diagnostic
4. Progress Monitoring
Participant Guide, p. 7

Collaborative Data Analysis

Looking at Data
– Predict: What do we think we'll see?
– Observe: What do we see?
– Infer: What does this mean for us now and in the future?
Participant Guide, p. 8

Predicting
– What are our assumptions?
– What do we predict we will see?
– What questions will we ask?

Observing
– What important information pops out?
– What patterns and trends do you see?
– What seems surprising?
– What seems odd or confusing?

Inferring
– What inferences or explanations can we make?
– What questions are raised?
– What additional data should we explore?

Let's try it…

Prediction: I expect to see growth in enrollment over the four years of data that I have.

What Do We See?
– Review the data.
– Make an observation.
– Share observations with colleagues.
Observe…
Participant Guide, p. 9

Riverview School District Enrollment: Observe…

Riverview School District Enrollment
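The Riverview enrollment figures shown on the slides are not reproduced in this transcript. As a minimal sketch (assuming a hypothetical enrollment.csv with year and enrollment columns; the file and column names are illustrative), a team could tabulate year-over-year change before the Observe discussion:

# A minimal sketch, not part of the original presentation. Assumes a
# hypothetical enrollment.csv with "year" and "enrollment" columns; the
# Riverview figures themselves are not included in this transcript.
import csv

def year_over_year_change(path="enrollment.csv"):
    with open(path, newline="") as f:
        rows = sorted(csv.DictReader(f), key=lambda row: int(row["year"]))
    previous = None
    for row in rows:
        year, count = int(row["year"]), int(row["enrollment"])
        if previous is None:
            print(f"{year}: {count}")
        else:
            change = count - previous
            print(f"{year}: {count} ({change:+d}, {100.0 * change / previous:+.1f}%)")
        previous = count

if __name__ == "__main__":
    year_over_year_change()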

Guidelines for Making Inferences
– Review your observations to see if there are any commonalities or patterns.
– Write any new questions that arise from reviewing your observations.
– Make inferential statements: "I wonder if _________ is the cause of ________." "I think _________ might be happening because of _____."
– Your inferences and questions may lead to the next analysis, or they can be tabled for later analysis.

Questions to Guide Analysis

Questions to Guide Analysis
– Is the assessment valid and reliable?
– What is the purpose of the assessment?
– What information does the data provide?
– Is the information adequate to make a decision? Do we need additional information? How can we get the information we need?
– What does the data tell us about student learning?
– What does the data tell us about instruction?
– Next steps?

How Does This Apply to a Classroom?
– Teachers can research what failing (or high-achieving) students have in common.
– Teachers can monitor discipline issues and patterns with a simple data collection plan (a brief sketch follows below).
– An observer can gather data about which students (or parts of a classroom) get the most attention or instruction.
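As a minimal sketch of such a data collection plan (assuming a hypothetical discipline_log.csv with date, period, and behavior columns; the file and column names are illustrative, not from the presentation), a teacher could tally incidents by class period to look for patterns:

# A minimal sketch, not part of the original presentation. Assumes a
# hypothetical discipline_log.csv with "date", "period", and "behavior"
# columns. Tallies incidents by class period to surface patterns worth
# discussing at a team meeting.
import csv
from collections import Counter

def incidents_by_period(path="discipline_log.csv"):
    counts = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            counts[row["period"]] += 1
    return counts

if __name__ == "__main__":
    for period, total in incidents_by_period().most_common():
        print(f"Period {period}: {total} incidents")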

How Does This Apply to a Classroom?
Teachers collect more useful data in a day than they may realize:
– Every classroom assessment should provide useful data.
– Teachers can use standards-based assessments to monitor individual student progress toward meeting objectives.
– Grade-level teams can use common assessments to provide useful data about the progress of all of their students.

Taking It Back to School
– What can you apply from today's session immediately upon your return?
– What additional information would you like to have?
– How will you share the strategic approach with your teachers?

References
Bernhardt, V. (2003). Using data to improve student learning. Larchmont, NY: Eye on Education.
Ingram, D., Seashore Louis, K., & Schroeder, R. (2004). Accountability policies and teacher decision making: Barriers to the use of data to improve practice. Teachers College Record, 106(6).
Schmoker, M. (1999). Results: The key to continuous school improvement (2nd ed.). Alexandria, VA: ASCD.
Stringfield, S., & Wayman, J. C. (2006). Data use for school improvement: School practices and research perspectives. American Journal of Education, 112(4).

Debbie Fast, Instructional Specialist, Chandler Unified District
Angee Lewandowski, Literacy Coach, Chandler Unified District
Alesha Henderson, Literacy Specialist, Pearson
Carey Regur, Director of Instructional Services, Pearson