2018 Improving Data, Improving Outcomes Conference August 2018

Presentation transcript:

2018 Improving Data, Improving Outcomes Conference, August 2018. Evaluating Implementation of Evidence-Based Practices: Tips for Improving Quality and Feasibility. Margaret Gillis, DaSy; Debbie Shaver, DaSy; Katy McCullough, ECTA; Stacy Kong, HI Part C.

Welcome!

Outcomes: Participants will increase understanding of (1) basic evaluation principles related to evaluating practice implementation, (2) features of high-quality tools and methods for evaluating practice implementation, and (3) strategies for summarizing data on practice implementation for different purposes.

Agenda What are we measuring and why? How are we measuring? What do we do with all this data? State example: Improving evaluation of Hawaii Part C’s SSIP evidence-based practices

What are we measuring and why?

Practice Implementation Improves Results

Definition of Practices: The teachable and doable behaviors that practitioners use with children and families, which can be used, replicated, and measured for fidelity. A practice needs to be clearly defined in order to be measured.

Definitions Practice Change: Increase or decrease in the number, frequency, precision, or quality of practices implemented by a practitioner as compared across at least two points in time. Fidelity: The extent to which practitioners are implementing an evidence-based program or practice as intended, as measured against a threshold or predetermined level. Fidelity implies strict and continuing faithfulness to the original innovation or practice that is expected to lead to the desired child/family outcomes.

Relationship Between Practice Change and Fidelity: Both practice change and practice fidelity are about the implementation of evidence-based practices (EBPs). Are practitioners changing their implementation of EBPs? Are practitioners implementing EBPs as intended? Practitioners can demonstrate changes in practices without reaching fidelity.

Measuring Practice Change and Fidelity: Use the same tools or methods to evaluate both fidelity and practice change. Practice change can be evaluated in relation to a fidelity threshold.
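A minimal sketch of this point, assuming a single instrument that yields a numeric score at two time points and a predetermined fidelity threshold; the scores, scale, and threshold below are hypothetical and not drawn from any particular tool:

```python
# Hypothetical illustration: practice change vs. fidelity from one tool's scores.
# The 7-point scale and the threshold of 5.0 are assumed values, not from an actual instrument.

FIDELITY_THRESHOLD = 5.0  # assumed fidelity threshold on a 7-point scale

def practice_change(score_time1: float, score_time2: float) -> float:
    """Change in practice between two observation points (positive = improvement)."""
    return score_time2 - score_time1

def at_fidelity(score: float, threshold: float = FIDELITY_THRESHOLD) -> bool:
    """Whether a score meets or exceeds the predetermined fidelity threshold."""
    return score >= threshold

# Example: a practitioner shows practice change but has not yet reached fidelity.
baseline, follow_up = 3.0, 4.5
print(practice_change(baseline, follow_up))   # 1.5 -> practices changed
print(at_fidelity(follow_up))                 # False -> not yet at fidelity
```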

Why measure practice change and practice fidelity? To ensure practices are implemented in such a way that improved outcomes can be expected; to determine whether practitioners maintain fidelity over time; and to obtain information for monitoring progress. Incremental increases toward fidelity can indicate that improvement strategies are working and highlight areas where practitioners need additional support, so corrections and adjustments in practice can be made in a timely manner.

Evaluation Plan Components: Outcomes, Evaluation Questions, Performance Indicators, Measurement/Data Collection Methods.

Alignment of Outcomes, Questions, and Performance Indicators. Outcomes: describe what you intend to achieve as a result of the activity(ies) related to EBPs; they are often interconnected and define steps toward achieving long-term outcomes. Evaluation Questions: describe what you need to know to determine if you have achieved the outcome. Performance Indicators: describe how you will determine if you have met the outcome, based on measurement data (e.g., extant or new).

Large Group Activity: Alignment. Outcome Description: Providers use evidence-based practices to support social-emotional skills with parents and infants/toddlers identified with social-emotional needs. Evaluation Question: Do providers know how to use evidence-based practices to support social-emotional skills with parents and infants/toddlers identified with social-emotional needs? Performance Indicator: 80% of providers completed training on evidence-based practices to support social-emotional skills with parents and infants/toddlers identified with social-emotional needs.

Large Group Activity: Alignment. Outcome Description: Providers use evidence-based practices to support social-emotional skills with parents and infants/toddlers identified with social-emotional needs. Evaluation Question: Do providers use evidence-based practices to support social-emotional skills with parents and infants/toddlers identified with social-emotional needs? Performance Indicator: 80% of providers use evidence-based practices to support social-emotional skills with parents and infants/toddlers identified with social-emotional needs.
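A rough sketch of how the performance indicator in this second alignment example might be checked against provider-level data; the provider records and the 80% target below are purely illustrative:

```python
# Hypothetical provider records; True means the provider uses the EBPs
# (as determined by observation or self-report).
provider_uses_ebps = {"Provider A": True, "Provider B": True, "Provider C": False,
                      "Provider D": True, "Provider E": True}

TARGET = 0.80  # performance indicator: 80% of providers use the EBPs

pct_using = sum(provider_uses_ebps.values()) / len(provider_uses_ebps)
indicator_met = pct_using >= TARGET

print(f"{pct_using:.0%} of providers use the EBPs; indicator met: {indicator_met}")
```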

Why We Measure: Key Take-Away Points. Practices = behaviors. Alignment, alignment, alignment. Evaluate practice change and practice fidelity using the same measures. Evaluate practices across children, settings, families, and programs.

How are we measuring?

Defining and Operationalizing Evidence-Based Practices: Practices that are not clearly defined and operationalized are not measurable. A Practice Profile is a way to define and articulate acceptable and non-acceptable practices (National Implementation Research Network). A Practice Profile is not a measurement tool, but it can be used as a starting place to develop or identify a tool for assessing practice change and fidelity.

Sample Practice Profile: Example Practice Profile for DEC Recommended Practices: Interactions (INT2). Core Component (INT2): Practitioners promote the child's social development by encouraging the child to initiate or sustain positive interactions with other children and adults during routines and activities through modeling, teaching, feedback, or other types of guided support. Contribution to the Outcome: The practitioner promotes positive social interactions so that the child will experience predictable social responses that will contribute to growth and learning. Expected/Proficient: A Head Start teacher helps a peer respond to a child's gestures by verbally translating his intention and desires, then modeling for the peer how to respond to the child. Developmental: A Head Start teacher tells peers to be friendly to the child who uses gestures to communicate. Unacceptable: A Head Start teacher assigns her assistant teacher to attend to the child who uses gestures to communicate. Developed by the Early Childhood Recommended Practice Modules project (RPM); see the RPM website at http://rpm.fpg.unc.edu/

Balancing High-Quality and Practical Measurement: The higher the quality of the measurement approach, the more useful and actionable the data are. The key is to ensure you get the most meaningful, accurate, and reliable information possible. Measurement also needs to be doable and practical.

Characteristics of High-Quality Measurement. The tool is: aligned to the selected evidence-based practices; valid (accurately measures the essential functions of the practice); reliable (produces consistent information across users, settings, activities, and time points); and practical (can be used with the staff and resources available).
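For the reliability characteristic in particular, one simple (if coarse) way to check consistency across users is percent agreement between two observers scoring the same session; a sketch with made-up ratings:

```python
# Hypothetical item-level scores from two observers rating the same practitioner session.
# Percent agreement is a basic check on the "reliable" characteristic described above.
observer_1 = [5, 6, 4, 5, 7, 3, 5]
observer_2 = [5, 6, 5, 5, 7, 3, 4]

agreements = sum(a == b for a, b in zip(observer_1, observer_2))
percent_agreement = agreements / len(observer_1)

print(f"Inter-rater agreement: {percent_agreement:.0%}")  # 71% with these made-up ratings
```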

Characteristics of High-Quality Measurement. The tool: can produce a meaningful fidelity threshold score that indicates whether practitioners have reached a level of implementation that is sufficient for achieving targeted child or family outcomes; allows for variation (e.g., over time, across practitioners); and provides useful information to practitioners to identify areas of strength and areas for improvement to move toward, reach, and maintain fidelity.

Characteristics of High-Quality Measurement. Other considerations to improve quality: Timing/frequency: the tool is used at sufficient frequency to measure incremental improvement (practice change) and the maintenance of fidelity over time. Training: clear instructions and training are provided to improve data quality. Purpose of data collection is clear: practitioners and data collectors understand how data will be used to improve implementation. Use consistent tools/instruments statewide.

Data Collection Methods: Observation by someone else, practitioner self-report, and review of documents. Multiple methods can be used, including qualitative methods.

Large Group Discussion What are the advantages and disadvantages of these data collection approaches? Observation Practitioner self-report Review of documents How have you tried to balance quality and practicality in evaluating practice implementation?

How to Make it Practical Leverage existing processes and structures (e.g., supervisors, coaches) Start with existing tools (e.g., fidelity tools associated with a model, tools related to DEC Recommended Practices) Prioritize the practices to be evaluated—it is not practical to evaluate all practices Start small—select a subgroup of practitioners and/or programs for more intensive measures (e.g., observations)

Measurement: Key Take-Away Points. Alignment, alignment, alignment. High-quality measurement = meaningful, useful data. Independent observation = more objective information than practitioner self-report. Measurement needs to be practical.

What do we do with all this data?

Purposes of Data Analysis and Use What do you want to know from your data? What information do decision-makers at each system level need to have to improve practice implementation? The type of decisions made at each level drives the evaluation questions and the appropriate analysis.

Example Questions at Different System Levels. Practitioner Level: Is the practitioner implementing with fidelity? Local Program Level: Are practitioners implementing the EBP with fidelity? Which components of the fidelity tool show the greatest growth across practitioners? The least growth? State Level: Are practitioners within local programs implementing the EBP with fidelity? What percentage of programs meet a performance indicator for the percentage of practitioners at fidelity?

Data Aggregation Example
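As an illustration of the kind of roll-up these questions describe (practitioner to program to state), here is a minimal sketch; the programs, scores, fidelity threshold, and indicator target are all hypothetical:

```python
# Hypothetical fidelity scores (7-point scale) by program; threshold and target are illustrative.
scores_by_program = {
    "Program 1": [6, 5, 4, 6],
    "Program 2": [3, 4, 5],
    "Program 3": [6, 6, 5, 5, 4],
}
FIDELITY_THRESHOLD = 5      # a practitioner is "at fidelity" at or above this score
PROGRAM_TARGET = 0.75       # a program meets the indicator if >= 75% of practitioners are at fidelity

# Program level: percent of practitioners at fidelity in each program.
program_rates = {
    program: sum(score >= FIDELITY_THRESHOLD for score in scores) / len(scores)
    for program, scores in scores_by_program.items()
}

# State level: percent of programs meeting the performance indicator.
state_rate = sum(rate >= PROGRAM_TARGET for rate in program_rates.values()) / len(program_rates)

for program, rate in program_rates.items():
    print(f"{program}: {rate:.0%} of practitioners at fidelity")
print(f"State: {state_rate:.0%} of programs meet the {PROGRAM_TARGET:.0%} indicator")
```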

Large Group Discussion How are you using/sharing data on practice implementation? Are local programs/sites using data on practice implementation? How? Are you using qualitative data to better understand practice implementation? How?

Data Analysis and Use: Key Takeaways. Align analysis to evaluation questions. Summarize data for different system levels. Qualitative data can bring quantitative data to life.

Hawaii’s SSIP Broad Improvement Strategies: (1) Enhance the statewide system of professional development (PD) to ensure implementation of evidence-based practices (EBPs) around the Primary Service Provider (PSP) Approach to Teaming and the Coaching model in the natural learning environment to support social-emotional (SE) development with fidelity. (2) Increase the capacity of early intervention programs to provide services and supports to address SE development. (3) Enhance the child outcomes summary (COS) process to ensure data are accurate and reliable and to ensure program effectiveness in supporting EBPs to improve children’s SE development.

Hawaii’s SiMR: Infants and toddlers with disabilities, in demonstration sites, will have substantially increased their rate of growth in social-emotional skills (including social relationships) by the time they exit early intervention. Summary Statement 1: Of those children who entered or exited the program below age expectations, the percent who substantially increased their rate of growth by the time they turned 3 years of age or exited the program.
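Summary Statement 1 is calculated from the counts of children in the OSEP progress categories (a through e), as commonly computed for Indicators C3/B7: (c + d) / (a + b + c + d). A small sketch with hypothetical counts:

```python
# Hypothetical counts of children in the OSEP progress categories for one outcome area.
progress_categories = {"a": 5, "b": 20, "c": 30, "d": 60, "e": 40}

# Summary Statement 1: of children who entered or exited below age expectations
# (categories a-d), the percent who substantially increased their rate of growth (c and d).
entered_below = sum(progress_categories[k] for k in "abcd")
substantially_increased = progress_categories["c"] + progress_categories["d"]
summary_statement_1 = substantially_increased / entered_below

print(f"Summary Statement 1: {summary_statement_1:.1%}")  # 78.3% with these made-up counts
```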

Hawaii’s EBPs & Measurement Tools. Primary Service Provider Approach to Teaming: *Checklist for Implementing a Primary-Coach Approach to Teaming (measures infrastructure). Coaching Model: SE Self-Assessment (measures practice change); *Coaching Log Summary Form (measures fidelity); *Hawaii SE Competencies Coaching Log Review (in process). *Shelden & Rush

Revisions Based on SSIP Evaluation Practice Change Webinars: Added a short-term outcome that was previously missed but is vital to the work being done. Deleted a short-term outcome that was not necessary to track, as it was embedded in another outcome. Deleted an intermediate outcome because it was time-intensive to track and not necessary. For all evaluation outcomes, Hawaii elaborated on the measurement/data collection method, measurement interval, and data management (who will enter data; how data will be stored; how data will be entered; when data will be entered; how data will be transmitted) and included analysis descriptions.

Hawaii’s Evaluation Plan - example

Hawaii’s Evaluation Plan - example

Hawaii’s Evaluation Plan - example

Questions/Comments?

Resources: Materials from the SSIP Evaluation online workshop series are posted on the DaSy website (including a list of existing tools): Evaluation of Implementation of EBP Workshop Resources. For more information on data aggregation and use at different system levels, see the ECTA Learning Lab, Exploring Coaching for Practice Change: Data Decision-making and the Implementation of Practice-Based Coaching: http://ectacenter.org/~calls/2017/learninglab.asp#session4

Contact Information Margaret.gillis@sri.com Katy.mccullough@unc.edu Debbie.shaver@sri.com

Thank you! Visit the DaSy website at: http://dasycenter.org/ Follow DaSy on Twitter: @DaSyCenter Visit the ECTA website at: http://ectacenter.org/ Follow ECTA on Twitter: @ECTACenter

Thank you. The contents of this tool and guidance were developed under grants from the U.S. Department of Education, #H373Z120002 and #H326P170001. However, those contents do not necessarily represent the policy of the U.S. Department of Education, and you should not assume endorsement by the Federal Government. Project Officers: Meredith Miceli, Richelle Davis, and Julia Martin Eile.