
1 Evaluating activities intended to improve the quality of Child Outcomes Data, August 2016

2 Presenters
Katrina Martin, DaSy/ECTA
Kellen Reid, DaSy
Kate Rogers, VT 619 Coordinator
Lauren Barton, DaSy/ECTA
Ruth Chvojicek, WI Birth to Three Program
Kathi Gillaspy, DaSy/ECTA

3

4 What to Expect in Today's Session
COS Data Quality Activities: Before, During, and After implementation. For each phase: Components to Consider, Example Evaluation Questions, and Evaluation Tools. An Activity and Wrap Up close the session.

5 Before: Professional Development; Policies and Procedures
During: Engaging in the COS Process; Ongoing support to and oversight of staff engaging in the COS process
After: Data Analysis; Supporting Local Use of COS data

6 Before Implementing the COS Process
Components to consider
– Policies and procedures
– Professional development system
Example evaluation questions
– To what extent are child outcomes professional development opportunities available and of high quality?
– To what extent do providers have adequate training and knowledge on the COS process and state policies and procedures?
Evaluation tools

7 Before Evaluation Question 1: To what extent are child outcomes professional development opportunities available and of high quality?

8 Planning/Evaluation tools
– Training calendars
– Surveys from events

9 High-Quality Professional Development Observation Checklist
– Preparation
– Introduction
– Demonstration
– Engagement
– Evaluation/Reflection
– Mastery

10 Before Evaluation Question 2: Do providers have adequate training and knowledge on the COS process and state policies and procedures?

11 Provider Survey
Originally developed by the ENHANCE project to specifically explore providers'
– experiences with the COS process, and
– training and support received
Recently adapted and used by VT B/619 in collaboration with DaSy and ECTA

12 Part B Section 619 Provider Survey
– Embedded Training & Knowledge Questions
– Embedded Policy & Procedure Questions

13 VT Survey Results & Next Steps

14

15 COS Competency Check
Online tool to assess provider understanding of the essential knowledge and skills needed to participate in the COS.
Structure: 2 levels
– Level 1: Screener
– Level 2: Application

16 During Implementation of the COS Process
Components to consider
– Quality practices while engaging in the COS process
– Coaching and support to staff surrounding the process
Example evaluation questions
– To what extent do local programs have systems in place that support high quality implementation of the COS process?
– To what extent do local teams implement the COS according to policies and procedures?
– How well do individuals implement high quality COS team collaboration practices?
– To what extent are the COS ratings accurate given the evidence?
Evaluation tools

17 During Evaluation Question 1: To what extent do local programs have systems in place that support high quality implementation of the COS process?

18 Local Program Child Outcomes Infrastructure Checklist
Local systems matter and can support quality data and outcomes…
Which quality elements are not yet, partly, or fully in place across component areas?
Program administrators and teams of stakeholders consider current status, prioritize opportunities for growth, and document changes over time.

19 During Evaluation Question 2: To what extent do local teams implement the COS according to policies and procedures?

20 WI PD to Support Quality Data
– Quality Data
– Data Review
– Assessment and the Rating Process
– Child Outcomes Birth to Six PD

21 TA to Support LEAs with Identified Need in Assessment
Child Outcomes – Getting Systems in Place from Entry to Exit
– Integration into the IEP/IFSP Process
– Revisit Purpose of the Indicator
– Overview: 5 Purposes of Assessment
– Guiding Principles of Authentic/Ongoing Assessment
– Comparison of Assessment Tools
– Practice with Assessment Tools to Practice Age-Anchoring
– Introduced to Child Outcomes Continuum

22 Integration into the IEP / IFSP Process

23 Continuum of Practices

24 Continuum of Practices - Sections
– Functional Ongoing Assessment
– Rating Practices
– Internal Monitoring System & Data Reporting
– Data Analysis

25 Levels of Practices
– Exemplary Practices: system integration; core competencies
– Expected Practices: expected of all districts; leads to accurate, meaningful data
– Developmental Practices: partially in place; some enhancements needed
– Unacceptable Practices: not good practices; lead to inaccurate data

26 During Evaluation Question 3: How well do individuals/teams implement high quality practices, including team collaborations?

27 COS-TC Toolkit Checklist
– Designed to assist in improving COS quality practices, including team collaboration and engaging families in the COS process
– Built around a checklist of quality practices
– Identify, observe, and assess quality practices that occur as part of the COS process
http://olms.cte.jhu.edu//olms2/COSTC

28 During Evaluation Question 4: To what extent are the COS ratings accurate given the evidence?

29 Building a case that supports accuracy…
Why to avoid the classic "interrater reliability" approach…
Examples of evidence:
– Training records show professionals have received training
– Surveys show they have knowledge about rating criteria
– Tools (e.g., COS-TC) show the process is being implemented well
– Record review/monitoring suggests ratings are reasonable for the case and the documented process is appropriate
– Data pattern checking shows meaningful patterns & few red flags
And the newest tool… providers demonstrate the ability to apply understanding in the beta version of the "Competency Check"

30 After Implementation of the COS Process
Components to consider
– Data cleaning and analysis
– To what extent do local programs review, interpret, and use COS data in meaningful ways?
Example evaluation questions
1. To what extent do staff adhere to established data entry policies and procedures?
2. To what extent are the COS ratings consistent with selected predicted patterns (i.e., few red flags)?
3. (Starting point) To what extent do staff access and review available child outcomes data reports?
Evaluation tools

31 After Evaluation Question 1: Do staff adhere to established data entry procedures?

32 Evaluation tools
– Data quality checks
– Data system checks (i.e., error reports)
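
Neither of these checks is spelled out in the slides, so the following is only a rough sketch, in Python, of what an automated data system check (error report) on COS records could look like. The record fields (child_id, entry and exit dates, one rating per outcome) and the rules (ratings must fall in 1-7, exited children need exit ratings, roughly six months between entry and exit COS) are assumptions made for illustration, not the presenters' actual checks.

```python
# Hypothetical sketch of an automated COS "error report" check.
# Field names and business rules are illustrative, not taken from the presentation.
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

OUTCOMES = ("positive_social_relationships", "knowledge_and_skills", "action_to_meet_needs")

@dataclass
class CosRecord:
    child_id: str
    entry_date: date
    exit_date: Optional[date] = None
    entry_ratings: dict = field(default_factory=dict)  # outcome -> rating 1..7
    exit_ratings: dict = field(default_factory=dict)   # outcome -> rating 1..7

def error_report(records):
    """Return a list of (child_id, problem) tuples for local follow-up."""
    problems = []
    for r in records:
        for outcome in OUTCOMES:
            entry = r.entry_ratings.get(outcome)
            if entry is None:
                problems.append((r.child_id, f"missing entry rating: {outcome}"))
            elif not 1 <= entry <= 7:
                problems.append((r.child_id, f"entry rating out of range 1-7: {outcome}={entry}"))
            if r.exit_date is not None:
                exit_rating = r.exit_ratings.get(outcome)
                if exit_rating is None:
                    problems.append((r.child_id, f"exited but missing exit rating: {outcome}"))
                elif not 1 <= exit_rating <= 7:
                    problems.append((r.child_id, f"exit rating out of range 1-7: {outcome}={exit_rating}"))
        # Assumed rule: entry and exit COS should be at least ~6 months apart.
        if r.exit_date is not None and (r.exit_date - r.entry_date).days < 183:
            problems.append((r.child_id, "less than ~6 months between entry and exit COS"))
    return problems
```

A state data system would typically run checks of this kind at data entry or upload and return the problem list to local programs as an error report.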

33 After Evaluation Question 2: Are the COS ratings consistent with selected predicted patterns?

34 Evaluation tools
– Pattern checking table
– COS Data Analysis Guide
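
The pattern checking table and the COS Data Analysis Guide themselves are not reproduced in these slides. Purely to illustrate what automated pattern checking can look like, the sketch below converts hypothetical entry/exit COS ratings into the five OSEP progress categories (a-e) using a simplified version of the usual conversion rules, then flags two example red-flag patterns; the thresholds and the sample data are invented for the sketch.

```python
# Illustrative pattern check: derive OSEP progress categories (a-e) from
# entry/exit COS ratings, then flag a few example "red flag" patterns.
# The red-flag thresholds below are arbitrary and only for demonstration.
from collections import Counter

def progress_category(entry_rating, exit_rating, made_progress):
    """Simplified COS-to-progress-category conversion for one outcome.

    entry_rating, exit_rating: COS ratings 1-7 (6-7 treated as age expected).
    made_progress: answer to the "acquired any new skills?" question.
    """
    if exit_rating >= 6:
        return "e" if entry_rating >= 6 else "d"
    if exit_rating > entry_rating:
        return "c"
    return "b" if made_progress else "a"

def pattern_check(outcome_records, max_share_e=0.65):
    """outcome_records: iterable of (entry_rating, exit_rating, made_progress)."""
    counts = Counter(progress_category(*rec) for rec in outcome_records)
    total = sum(counts.values())
    shares = {cat: counts.get(cat, 0) / total for cat in "abcde"}
    flags = []
    if shares["e"] > max_share_e:
        flags.append(f"unusually high share in category e ({shares['e']:.0%})")
    if counts.get("a", 0) == 0 and counts.get("b", 0) == 0:
        flags.append("no children in categories a or b: check for possible rating inflation")
    return shares, flags

if __name__ == "__main__":
    # Made-up example data: (entry, exit, made_progress) for one outcome area.
    data = [(3, 6, True), (6, 7, True), (2, 4, True), (5, 5, True), (6, 6, True)]
    print(pattern_check(data))
```

In practice the expected patterns and red flags would come from the state's pattern checking table rather than fixed thresholds like these.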

35 Wisconsin Data Reviews
2011-2013: Simple pattern checking; red flags in patterns
2013-2015: More involved pattern checking; added progress categories; added how to work with Excel and the Progress Calculator to dig deeper
2015-2016: Preparation for the new Child Outcomes application; targeted districts by looking at red flags & outliers with regional TA; focused on LEAs not using age-anchoring tools consistently

36 After Evaluation Question 3: Do staff access available COS data reports?

37 Evaluation tools
Provider survey
– Reports are accessible
– Staff understand how to use the information in the reports to answer accountability and program questions
– Staff analyze data and develop plans for program improvement

38 Activity
1. Consider a question you are interested in answering.
2. Sit with others who are also interested.
3. Review the tool and talk in your group about possible applications to your efforts to evaluate your child outcomes data quality.
4. Come back to share some a-ha's with the larger group.

39 Groups for Activity
1. Do providers have adequate training on the COS process and COS policies and procedures?
2. Do providers have adequate knowledge of quality practices for conducting the COS, and to what extent do local teams implement the COS according to policies and procedures?
3. Are the COS ratings accurate given the evidence?
4. Are the COS ratings consistent with selected predicted patterns?

40 Report Out and Questions

41 Wrap Up

42 Contact Information
Katrina Martin, katrina.martin@sri.com
Kellen Reid, kellen.reid@unc.edu
Kate Rogers, kate.rogers@vermont.gov
Lauren Barton, lauren.barton@sri.com
Ruth Chvojicek, chvojickr@cesa5.org
Kathi Gillaspy, kathi.gillaspy@unc.edu

