Am I Making a Difference? Using Data to Improve Practice
Megan Vinh, PhD, and Lise Fox, PhD
2016 National Inclusion Institute, May 12, 2016

Welcome
Who's in the room?
–What is your role?
–What areas do you work in?

Agenda
Key concepts for data-based decision making, including:
–The basics of data-based decision making
–Using data to make child-level and implementation decisions
–Creating a culture of data-based decision making
Discussion

THE BASICS OF DATA-BASED DECISION MAKING

A Data-Based Decision-Making Approach: Some Basic Assumptions
Outcomes are identified.
Fidelity and outcomes are measured.
Data are summarized and used to:
–Identify training needs
–Deliver professional development
–Make other programmatic changes (e.g., playground schedule, program-wide expectations)
–Problem solve around specific children or issues
–Ensure child learning and success
Data collection AND ANALYSIS is an ongoing process.

Data-Based Decision-Making Cycle: LOOK (Evidence) → THINK (Inference) → ACT

Organizing for an Effective Problem-Solving Conversation
[Diagram: Problem, Solution, Out of Time, Use Data]
A key to collective problem solving is to provide a visual context that allows everyone to follow and contribute.

Using Your Data

So, how do I begin?
What are your questions?
What is your process for looking at data and making interpretations?
What data sources might you have?
Are there other data you need to collect or gather?

Starting with a question (or two…)
All analyses are driven by questions.
Questions come from different sources.
Different versions of the same question are necessary and appropriate for different audiences.
What are your critical questions?

What is Your Process for Looking at Data?
Evidence → Inference → Action

Evidence (Look)
Evidence refers to the numbers, such as "31% of children have been removed at least once."
The numbers are not debatable.
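To make this concrete, here is a minimal sketch (not from the original deck) of how an evidence statement like the one above could be computed from raw incident records; the record fields and values are hypothetical.

```python
# Minimal sketch: compute an evidence statement from incident records.
# The record fields and values below are hypothetical.
incidents = [
    {"child_id": "C01", "type": "removal"},
    {"child_id": "C01", "type": "removal"},
    {"child_id": "C02", "type": "call_to_family"},
    {"child_id": "C03", "type": "removal"},
    {"child_id": "C04", "type": "removal"},
]
enrolled = 13  # total number of enrolled children (the denominator)

# Count each child once, no matter how many removals they had.
removed_children = {r["child_id"] for r in incidents if r["type"] == "removal"}
pct = 100 * len(removed_children) / enrolled
print(f"{pct:.0f}% of children have been removed at least once")  # 23%
```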

Inference (Think)
How do you interpret the evidence? What can you conclude from the numbers?
Does the evidence mean good news? Bad news? News you can't interpret?
To reach an inference, sometimes you need to analyze data in other ways (ask for more evidence).

Inference (Think)
Inference is debatable: even reasonable people can reach different conclusions.
Stakeholders with a variety of perspectives can help put meaning on the numbers.
Early on, the inference may be more a question of the quality of the data.

Action
Given the inference from the numbers, what should be done?
Recommendations or action steps.
Action can be debatable – and often is.
Another role for stakeholders and teams.
May involve looking at additional data and information.
Again, early on the action might have to do with improving the quality of the data.

USING DATA: IMPLEMENTATION AND CHILD LEVEL

Two Primary Considerations
Are we doing what we should be doing?
–Fidelity of implementation
Is it making a difference?
–Impact
–Proximal to distal outcomes

Two Examples
Reaching Potential through Recommended Practices (RP²)
Pyramid Model

RP² Data-Based Decision-Making Plan

                        | Home Visiting                                                        | Classroom
Program Implementation  | RP²: Benchmarks of Quality for Home Visiting Programs                | RP²: Benchmarks of Quality for Classrooms
Practice Implementation | Recommended Practices Observation Scale – Home Visiting (RP² OS-HV) | Recommended Practices Observation Scale – Classroom (RP² OS-C)
Child Outcome           | Child Engagement Scale (Dunst & Trivette, 2014)                      | STARE: Scale for Teacher Assessment of Routine Engagement (McWilliam, 2011)

Data-Based Decision Making for Teachers: Child Outcomes
STARE: Scale for Teacher Assessment of Routine Engagement
–Identifies the child's level of engagement in learning opportunities (with peers, adults, and materials)
–Teacher completes on a target child after a target activity at least 2x per week
–Growth in engagement for target children
–Internal coach completes during observation

STARE: Scale for Teacher Assessment of Routine Engagement (McWilliam, 2011)

Child Engagement Scale
Designed to be used by home visitors and coaches.
Engagement is rated after the activity.

Child Engagement Scale – Home Visitor Use
Identify the target activity or routine that the child and parent have been doing together (one where child engagement is a struggle, or where the home visitor is trying to help the parent improve the use of a practice).
The home visitor rates each of the indicators after the activity ends.
The home visitor should share the data with the family and discuss any changes that need to be made (based on the data).

Child Engagement Scale – Coach Use
The internal coach uses the form in the same way the home visitor does.
After the home visit or video observation, compare your scores with the home visitor's scores and clarify any definitional issues.
Conduct observations across time and chart child progress (see the sketch below).
Use your data to provide feedback on changes for the child.
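As an illustration of "chart child progress," a minimal sketch that classifies the trend in a series of engagement ratings; the numeric 1–4 scale and the values are assumptions for the example, not the actual Child Engagement Scale scoring.

```python
# Minimal sketch: label the trend in repeated engagement ratings.
# The 1-4 scale and the ratings are hypothetical illustration values.
ratings = [1, 2, 2, 3, 3, 4]  # one rating per observation, in date order

def trend(scores, threshold=0.1):
    """Least-squares slope of scores over time, mapped to a label."""
    n = len(scores)
    mean_x = (n - 1) / 2
    mean_y = sum(scores) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(scores))
    den = sum((x - mean_x) ** 2 for x in range(n))
    slope = num / den
    if slope > threshold:
        return "improving"
    if slope < -threshold:
        return "decreasing"
    return "no clear change"

print(trend(ratings))  # -> improving
```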

What inferences can you make from the data?
What is the level of engagement of the child in the target activity?
Is the trend indicating improvement, decreasing engagement, variable engagement, or no change?
Is there a relationship between child data and practitioner implementation of practices? (One simple check is sketched below.)
Does the practitioner's action plan include strategies/practices that might directly affect child engagement?
Are there missing data?
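For the relationship question, one option (a sketch with hypothetical paired scores, not a prescribed analysis) is to correlate practitioner fidelity with child engagement across visits:

```python
# Minimal sketch: correlate practice fidelity with child engagement.
# Paired per-visit scores below are hypothetical illustration values.
fidelity = [50, 60, 70, 80, 90]    # practitioner fidelity per visit (%)
engagement = [2, 2, 3, 3, 4]       # child engagement rating per visit

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# A positive r hints that higher fidelity tracks higher engagement;
# with this few data points it is a conversation starter, not proof.
print(f"r = {pearson_r(fidelity, engagement):.2f}")  # r = 0.94
```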

Evaluation Plan – Pyramid Model
Implementation: Benchmarks of Quality; Teaching Pyramid Observation Tool (TPOT); TPITOS; Pre-SET
Program: Program incidents (calls to families, dismissals, transfers, requests for assistance, family conferences); Behavior Incident Reports
Child: Progress monitoring (see PTR); curriculum-based assessment or rating scales

Benchmarks of Quality Example
–Establish Leadership Team (6 benchmarks)
–Staff Buy-In (2 benchmarks)
–Family Involvement (4 benchmarks)
–Program-Wide Expectations (5 benchmarks)
–Strategies for Teaching and Acknowledging Program-Wide Expectations (3 benchmarks)
–Classrooms Demonstrate Adoption (5 benchmarks)
–Procedures for Responding to Challenging Behavior (5 benchmarks)
–Staff Support Plan (7 benchmarks)
–Monitoring Implementation and Outcomes (6 benchmarks)
*Scores for critical areas range from 0 (no implementation) to 2 (full implementation).
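Because each benchmark is scored 0–2, a leadership team can turn raw scores into a percent-implemented summary per critical area; a minimal sketch with hypothetical scores:

```python
# Minimal sketch: summarize Benchmarks of Quality by critical area.
# Each benchmark is scored 0 (not in place) to 2 (fully in place).
# The scores below are hypothetical illustration values.
boq = {
    "Establish Leadership Team": [2, 2, 2, 1, 2, 2],  # 6 benchmarks
    "Staff Buy-In": [1, 0],                           # 2 benchmarks
    "Family Involvement": [2, 1, 1, 0],               # 4 benchmarks
}

for area, scores in boq.items():
    pct = 100 * sum(scores) / (2 * len(scores))  # percent of possible points
    print(f"{area}: {pct:.0f}% implemented")
# Low-percentage areas (here, Staff Buy-In at 25%) are candidates
# for the leadership team's attention.
```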

What Inferences Can You Make?
What elements are fully in place?
What elements are not in place or partially in place?
Where has there been the most growth?
What elements appear to be the ones needing attention?
What other data might we want to examine?
Are there areas for growth that might be pivotal (e.g., buy-in, procedures for responding to challenging behavior)?

Teacher Implementation
Teaching Pyramid Observation Tool (Hemmeter, Fox, & Snyder, 2014)

Pyramid Model Practices the TPOT Is Designed to Measure

Observation items:
1. Schedules, routines, and activities (SR)
2. Transitions between activities (TR)
3. Supportive conversations (SC)
4. Promoting engagement (ENG)
5. Providing directions (PD)
6. Collaborative teaming (CT)
7. Teaching behavior expectations (TBE)
8. Teaching social skills and emotional competencies (TSC)

Observation and interview items:
9. Teaching friendship skills (FR)
10. Teaching children to express emotions (TEE)
11. Teaching problem-solving (TPS)

Interview items:
12. Interventions for children with persistent challenging behavior (PCB)
13. Connecting with families (COM)
14. Supporting families in using Pyramid Model practices (INF)

Observation of challenging behavior:
–Strategies for responding to challenging behavior (SCB)

Red flags:
–Items that need immediate attention to create classroom environments and procedures that promote social and emotional competence

What inferences can you make?
–What are teacher strengths?
–What are areas that are lower?
–What other data might inform a decision?
–How might these data influence professional development?

What inferences can you make?
–What are teacher strengths across classrooms?
–What are areas that are lower across classrooms? (One way to summarize across classrooms is sketched below.)
–What other data might inform a decision?
–How might these data influence professional development?
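A minimal sketch of a cross-classroom summary, assuming TPOT item scores are recorded as percent of indicators observed per classroom; the item subset and values are hypothetical.

```python
# Minimal sketch: average TPOT item scores across classrooms and sort
# to surface program-wide strengths and growth areas.
# Scores are hypothetical percent-of-indicators values per classroom.
tpot = {
    "Schedules, routines, and activities (SR)": [90, 85, 95],
    "Transitions between activities (TR)": [70, 60, 65],
    "Teaching friendship skills (FR)": [40, 35, 55],
}

averages = {item: sum(s) / len(s) for item, s in tpot.items()}
for item, avg in sorted(averages.items(), key=lambda kv: kv[1]):
    print(f"{avg:5.1f}%  {item}")
# The lowest-scoring items point to shared professional development
# needs; the highest are program-wide strengths to maintain.
```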

CREATING A CULTURE OF DATA-BASED DECISION MAKING

Data – It's a Leadership Team Responsibility
Monthly review of data
–Who, how often, what, where, when
Monthly review of program incidents (see the sketch after this list)
–What's up, what's down, why, and what should we do about it
Review of all teacher fidelity measures to determine next steps, training, coaching, and support
Review of child progress data to ensure supports are effective
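For the monthly incident review, a minimal sketch of the "what's up, what's down" comparison; the monthly counts are hypothetical.

```python
# Minimal sketch: month-over-month comparison of incident counts.
# The counts below are hypothetical illustration values.
incident_counts = {"Jan": 18, "Feb": 24, "Mar": 15}

months = list(incident_counts)
for prev, curr in zip(months, months[1:]):
    delta = incident_counts[curr] - incident_counts[prev]
    direction = "up" if delta > 0 else "down" if delta < 0 else "flat"
    print(f"{curr}: {incident_counts[curr]} incidents "
          f"({direction} {abs(delta)} vs. {prev})")
# The team then asks "why?" and decides what to do about it.
```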

Cultural Barriers to Data-Based Decision Making
1. Many providers/teachers have developed their own personal metric for judging the effectiveness of their intervention/teaching, and often this metric differs from the metrics of external parties (e.g., state accountability systems and school boards).
2. Many providers/teachers and administrators base their decisions on experience, intuition, and anecdotal information (professional judgment) rather than on information that is collected systematically.
3. There is little agreement among stakeholders about which child outcomes are most important and what kinds of data are meaningful.
4. Some providers/teachers disassociate their own performance from that of children, which leads them to overlook useful data.

Ingram, D., Louis, K. S., & Schroeder, R. G. (2004). Accountability policies and teacher decision making: Barriers to the use of data to improve practice. Teachers College Record, 106(6), 1258–1287.

Technical Barriers to Data-Based Decision Making
5. Data that providers/teachers want – about "really important outcomes" – are rarely available and are usually hard to measure.
6. Programs and schools rarely provide the time needed to collect and analyze data.

Ingram, D., Louis, K. S., & Schroeder, R. G. (2004). Accountability policies and teacher decision making: Barriers to the use of data to improve practice. Teachers College Record, 106(6), 1258–1287.

Discussion Questions – Small Group Activity
What are your barriers to creating a culture of data-based decision making? What are your potential solutions?
What do you struggle with in being a data-based decision-maker? What are your potential solutions?

QUESTIONS?

Keeping In Touch Megan Vinh, Lise Fox,

The contents of this presentation were developed under a grant from the U.S. Department of Education, #H373Z120002, and a cooperative agreement, #H326P120002, from the Office of Special Education Programs, U.S. Department of Education. However, those contents do not necessarily represent the policy of the U.S. Department of Education, and you should not assume endorsement by the Federal Government. DaSy Center Project Officers: Meredith Miceli and Richelle Davis. ECTA Center Project Officer: Julia Martin Eile.