Strategies for Increasing Data Quality

1 Strategies for Increasing Data Quality
Kate Rogers, Early Learning Team Manager Katie McCarthy, VT 619 Coordinator

2 Vermont Early Childhood Systems
State-level interagency work on state policy and grant initiatives:
- Universal PreK
- ECSE services: 77% of children with disabilities served in inclusive EC settings
- RTT-ELC
- Early Childhood Outcomes (COS) process embedded into the IEP (2013)

3 Where We are Coming From: State Context
- Universal PreK: full implementation Fall 2016, for all 3-, 4-, and 5-year-olds not enrolled in kindergarten, in prequalified public and private EC programs (including Head Start)
- 77% of children with disabilities served in inclusive EC settings
- QRIS (STARS)
- State PreK Inclusion Coordinator
- RTT-ELC: Early MTSS, PreK assessment, QRIS, Vermont Early Learning Standards (birth to age 8), PreK monitoring system, SLDS, PreK-Grade 3
- Early Childhood Outcomes (COS) process embedded into the IEP (2013)

4 Why did we prioritize child outcomes?
2013:
- Embedded the ECO process into the IEP process
- Compiled anecdotal comments and questions about ECO from the field; it was clear that teachers and staff did not see its purpose, use, and value
- Consequently, we received poor-quality data; for example, red flags popped up for impossible combinations in the COS calculator

2015:
- Joined the ECTA/DaSy cohort for technical assistance
- Heard from providers that they were not invested in collecting the data and did not see value in the process
- Frequently saw impossible combinations when the COS data were entered into the COS calculator
- In preparing for full implementation of Universal PreK for each and every child, we were expanding the teams serving children with disabilities, so we needed to be able to communicate responsibilities to PreK providers
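To make the "impossible combinations" idea concrete, the sketch below shows the kind of validation a COS calculator can run on a single record. These rules are hypothetical illustrations of the concept, not the ECTA COS calculator's actual logic, and the field names are assumptions.

```python
# Illustrative sketch of flagging "impossible combinations" in COS data.
# NOTE: the rules and field names below are hypothetical examples,
# not the ECTA COS calculator's actual validation logic.

def flag_impossible(record):
    """Return red flags for one child's COS record in one outcome area.

    record: dict with integer 'entry' and 'exit' ratings (1-7) and a
    'made_progress' answer ("yes"/"no").
    """
    flags = []
    entry = record["entry"]
    exit_rating = record["exit"]
    progress = record["made_progress"]

    if not (1 <= entry <= 7 and 1 <= exit_rating <= 7):
        flags.append("rating outside the 1-7 COS scale")
    if exit_rating > entry and progress == "no":
        # A higher exit rating implies the child acquired new skills,
        # which contradicts a "no" answer to the progress question.
        flags.append("exit rating rose but progress answered 'no'")
    return flags
```

Running every submitted record through a check like this surfaces records that need follow-up with the submitting team rather than silently entering the state dataset.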

5 How VT Identified Root Causes of Data Quality Issues
Disseminated the statewide VT 619 Implementation Survey (available on the conference website). "The survey said…":
- Teachers and staff were not engaged in teaming practices
- The level of family engagement was minimal
- There was a need for technical assistance and professional learning

6 How are we improving teaming practices?
Personnel/Workforce:
- Cross-agency (AOE and AHS) professional learning coordination
- Training modules and packets ready to launch Fall 2017 (list available on the conference site)

Governance:
- Developed an ECO Practice and Procedure Manual
- Universal PreK regulations include rules to ensure inclusion and access to the service delivery model
- PreK Monitoring System incorporates the ECSE Guiding Principles for Full Participation

Accountability:
- Exploring the Local Child Outcomes Measurement Framework (LCOMS; available on the conference website) as part of the PreK monitoring system

7 Early Childhood Outcomes Practices and Procedures Training and TA
WHO are the learners?
- ECSE educators and related service personnel
- PreK educators (private and public)
- Administrators

WHAT is the content?
- P and P Manual
- Topics such as inclusion and ADA/504

HOW is the content delivered?
- State and local strategies
- Training and TA modules
- State early childhood website
- Face to face

76.37% of children in inclusive settings (1,351 of 1,769 children total)

8 Evaluating the impact of the system improvements on teaming
- Pre- and post-implementation surveys
- COS-TC Quality Practices Checklist and Descriptions (Younggren, Barton, Jackson, Swett, & Smyth, 2017)
- VT-LCOMS (installation phase)
- The proof is in the (ECO data) pudding!

9 Next steps
Quality Standards:
- Revising the QRIS to reflect high-quality practices for children with disabilities, including teaming and assessment

Data System:
- AOE is exploring "real time" ECO reports for local and state use
- Planning ECO Data Days (DAZE!) for LEAs

10 WA Part C Early Intervention Services
Debi Donelan, Assistant Administrator of Training and Technical Assistance Susan Franck, Part C Data Manager

11 Where we are coming from: State Context
- Housed within a department of early learning
- Currently have 25 local lead agencies (LLAs)
- System design work to address infrastructure needs, including priority areas: regionalization, resources, rules, and a robust data system
- SSIP focused on increasing social-emotional outcomes
- Have an online IFSP with real-time reporting available to multiple user types; COS embedded into the IFSP

12 Why did we prioritize child outcomes?
SSIP data analysis identified the following:
- Unexpected patterns related to entry ratings: high entry ratings in the social-emotional outcome, and in-depth analysis (local team interviews) identified an inconsistent COS process, especially related to family involvement
- Inconsistent understanding of the Child Outcome Summary (COS) process
- Inconsistent involvement of families in the COS process
- Needed better measurement to understand the impact of SSIP improvement activities
- Had data reports available through our web-based system that locals were not using
- Did not have a structure in place to support local data use

13 Identifying root causes for data quality issues
Completed extensive analysis of existing child outcomes data patterns, including:
- Entry and exit ratings
- Age at entry
- Length of time in service
- Race/ethnicity
- Gender
- Disability category
- Service area
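A disaggregation like the one listed above boils down to a group-by over child records. The sketch below is illustrative only; the field names ('entry', 'gender', and so on) are assumptions, not Washington's actual data schema.

```python
from collections import defaultdict

# Illustrative sketch: average COS entry rating disaggregated by a
# demographic field. Field names are assumed, not an actual schema.
def mean_entry_by_group(records, group_field):
    sums = defaultdict(float)
    counts = defaultdict(int)
    for rec in records:
        key = rec[group_field]       # e.g. "F"/"M" for group_field="gender"
        sums[key] += rec["entry"]
        counts[key] += 1
    return {key: sums[key] / counts[key] for key in sums}
```

Repeating the same aggregation over each variable on the list (age at entry, disability category, service area, …) is what surfaces the subgroup patterns the analysis was looking for.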

14 How are we improving local data use in WA
Data System:
- Planning a revision to the data system to improve the quality of the data collected

Personnel/Workforce:
- Developed a series of COS modules and required all providers to complete an assessment of their knowledge after completing the modules
- Completed "Fun with Data" activities at local lead agency meetings
- Focusing quarterly calls with local lead agency administrators on the use of child outcomes data

Governance:
- System design plan aligns authority to ensure all early intervention providers complete required trainings

Accountability and Quality Improvement:
- Requiring local lead agencies to complete the Local Child Outcomes Measurement Framework as part of their self-assessment

15 Developed an evaluation to see where each LLA was related to:
- Finding reports
- Understanding reports
- Using reports to analyze data
- Using reports to monitor data
- Using reports to assess progress and make program adjustments
(Considered across levels: SLA, LLAs, providers, and families)

16 Fun with Data- activities at LLA meetings
November Fun with Data Support LLAs to start looking at data and identifying patterns and hypothesize contributing factors Small groups compared de-identified local data in comparison to statewide data November Fun with Data 2.0 Support LLAs to review their own data in comparison to statewide data Reviewed entry by exit data using pattern checking tool FUN WITH DATA: At our outcomes cohort face-to-face meeting in November, 2015, we examined local data patterns in child outcomes: Outcome 1 (social-emotional) Observed high percentage of ratings of 6 or 7 at entry We reviewed data that was disaggregated by Local Lead Agency and identified programs where this was more of a concern Social-emotional is the focus of our SSIP work We developed an activity to complete with Local Lead Agencies later that month Purpose to support LLAs to start looking at data and identifying patterns We shared sample entry scores from two de-identified programs (labeled program A and B) in comparison to statewide data. We asked participants to identify a pattern in the local data that differed from the statewide data. (Program A had a very high percentage of a COS entry score of 3, Program B had a very high percentage of a COS entry score of 6.) Small groups hypothesized which contributing factors might have influenced the data, using the local contributing factors tool, which includes: Policies and Procedures, Infrastructure, Data Quality, Training/Technical Assistance, Supervision, Child Outcome Summary Rating Process This activity was our first step in building data analysis into our LLA meetings. This November we completed the second phase of the activity (Fun with Data 2.0) using their local data in comparison to statewide data. The goal was to continue building capacity at the local level. We provided each LLA with their own data and reviewed the Outcome 1 entry by exit report, which showed number and percentages of exit scores in relation to entry scores. 
We shared the pattern checking tool and described the predicted pattern: functioning at entry in one outcome area will be related to functioning at exit in the same outcome area
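The core of the Fun with Data comparison, lining a program's entry-rating distribution up against the statewide distribution and spotting where they diverge, can be sketched as below. This is a hypothetical illustration, not the actual activity materials or pattern checking tool; the 15-point flagging threshold is an assumption.

```python
from collections import Counter

def rating_distribution(ratings):
    """Percentage of records at each COS rating 1-7."""
    counts = Counter(ratings)
    total = len(ratings)
    return {r: 100.0 * counts.get(r, 0) / total for r in range(1, 8)}

def unusual_ratings(local, statewide, threshold=15.0):
    """Ratings where the local percentage differs from the statewide
    percentage by more than `threshold` points (an assumed cutoff)."""
    local_pct = rating_distribution(local)
    state_pct = rating_distribution(statewide)
    return [r for r in range(1, 8)
            if abs(local_pct[r] - state_pct[r]) > threshold]
```

A program like the deck's "Program B", with an unusually large share of entry ratings of 6, would show up here as a flagged rating, prompting the contributing-factors discussion rather than settling it.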

17 Fun with Data- activities at LLA meetings

18 Quarterly call topics
October 2016:
- Reviewed the quarterly call process
- Provided an orientation to all COS reports
- Reviewed COS reports: progress codes and summary statements, the entry-by-exit report, and progress codes by ethnicity

January 2017:
- Developed a resource; LLAs demonstrated understanding of the COS process and reports
- Data activity: LLAs charted their progress category percentages for each outcome for the year against the same data for the state

April 2017:
- Reviewed the COS resource: guiding questions and activity template
- Data activity: walked through the guiding questions with the COS reports

19 Evaluating the impact of the system improvements
- 93% of providers completed the COS modules and passed the quiz
- Quarterly assessment of participants' comfort with data: the average ability-to-access-reports score increased from 3.5 on the October call to 4.2 on the January call
- Challenges/lessons learned: identifying a learning progression for data skills; multiple people on a call can have different levels of comfort with data, which makes it difficult to estimate a score for the program as a whole
- Progress has been made toward this outcome

20 Next steps
- Support locals in replicating data activities with their providers
- Revise the data management system to streamline reporting and better meet the needs of local administrators

21 Discussion Questions
- What steps are you taking to improve child outcomes data quality?
- How are stakeholders included in child outcomes data quality efforts?
- Is this work part of your SSIP improvement strategies?
- Training and adult learning principles: are states interested in doing this, and what would their action steps be to work with their locals?
- Do you have resources available to build capacity at the local level?
- What is the one step you would take?
- What are your action steps to improve child outcomes data quality leaving this session?

22 Thank you
The contents of this tool and guidance were developed under grants from the U.S. Department of Education, #H326P and #H373Z. However, those contents do not necessarily represent the policy of the U.S. Department of Education, and you should not assume endorsement by the Federal Government. Project Officers: Meredith Miceli, Richelle Davis, and Julia Martin Eile.

