Integrating Outcomes Learning Community Call February 8, 2012

Presentation transcript:

Integrating Outcomes Learning Community Call February 8, 2012 Improving and Interpreting Child Outcomes Data

Objectives for the call: to discuss strategies and resources for improving child outcomes data quality, and to focus on looking at data through pattern checking as a mechanism for improving data quality. Early Childhood Outcomes Center

Quality Assurance: Looking for Quality Data I know it is in here somewhere Early Childhood Outcomes Center

Ongoing checks for data quality. Before: good training and assessment; efficient data systems; timely and accurate data entry. During: ongoing supervision of implementation; feedback to implementers; refresher training. After: review of a sample of completed COSFs; pattern checking analysis. Early Childhood Outcomes Center

Quality checks before data collection begins Good training and assessment Efficient data systems Timely and accurate data entry Early Childhood Outcomes Center

Promoting Quality Data Through training and communication (before and during data collection) related to: assessment, understanding the COSF process, age expectations, and data entry. Early Childhood Outcomes Center

Promoting Quality Data Through data systems and verification, such as data system error checks and good data entry procedures. Early Childhood Outcomes Center
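
As an illustrative sketch of the kind of automated error checks a data system might run, here is a short Python example; the column names (entry_rating, exit_rating, entry_date, exit_date) and the specific rules are hypothetical, not prescribed by ECO.

```python
# Illustrative record-level checks for COSF data entry (hypothetical column names).
import pandas as pd

def check_cosf_records(df: pd.DataFrame) -> pd.DataFrame:
    """Return a DataFrame listing records that fail basic entry checks."""
    issues = []
    for idx, row in df.iterrows():
        # Entry and exit ratings must be whole numbers from 1 to 7.
        for col in ("entry_rating", "exit_rating"):
            value = row.get(col)
            if pd.isna(value) or value not in range(1, 8):
                issues.append({"record": idx, "field": col, "problem": f"invalid rating: {value!r}"})
        # The exit date must come after the entry date.
        if pd.notna(row.get("entry_date")) and pd.notna(row.get("exit_date")):
            if row["exit_date"] <= row["entry_date"]:
                issues.append({"record": idx, "field": "exit_date", "problem": "exit date not after entry date"})
    return pd.DataFrame(issues)

# Toy example: record 1 has an out-of-range rating and reversed dates,
# record 2 is missing an exit rating.
records = pd.DataFrame({
    "entry_rating": [3, 9, 5],
    "exit_rating": [5, 6, None],
    "entry_date": pd.to_datetime(["2011-01-10", "2011-02-01", "2011-03-15"]),
    "exit_date": pd.to_datetime(["2011-09-10", "2011-01-15", "2011-10-01"]),
})
print(check_cosf_records(records))
```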

Quality checks during data collection Ongoing supervision of implementation Feedback to implementers Refresher training Early Childhood Outcomes Center

Ongoing supervision: review of the process. Is the process high quality? Are teams reaching the correct rating? Methods: observation, videos (e.g., quality review of COS team discussion). Early Childhood Outcomes Center

Ongoing Supervision Feedback to teams is critical. Refresher training. Beware of auto pilot and drift. Early Childhood Outcomes Center

Quality review through process checks: provider surveys, self-assessment of competence, knowledge checks, process descriptions (who participates?), and identification of barriers. Early Childhood Outcomes Center

Quality checks after data collection Review sample of completed COSFs Pattern checking analysis Early Childhood Outcomes Center

Quality Review of Completed COS Forms: Is there adequate evidence? Does the evidence match the outcome area? Is it based on functional behaviors? Across settings and situations? Are the ratings consistent with the evidence? Early Childhood Outcomes Center

Pattern Checking Analysis The quality of the child outcomes data is established by a series of analyses that demonstrate the data show predictable patterns. Pattern Checking Table: http://www.fpg.unc.edu/~eco/assets/pdfs/Pattern_Checking_Table.pdf

Ongoing checks for data quality - Poll. Before: good training and assessment; efficient data systems; timely and accurate data entry. During: ongoing supervision of implementation; feedback to implementers; refresher training. After: review of a sample of completed COSFs; pattern checking analysis. Early Childhood Outcomes Center

Take Home Message If you conclude the data are not (yet) valid, they cannot be used to evaluate program effectiveness, guide program improvement, or anything else. What do you do if the data are not as good as they should be? Answer: continue to improve data collection through ongoing quality assurance. Early Childhood Outcomes Center

Looking at Data Early Childhood Outcomes Center

Using data for program improvement = EIA Evidence Inference Action Early Childhood Outcomes Center

Evidence Evidence refers to the numbers, such as “45% of children in category b” The numbers are not debatable Early Childhood Outcomes Center

Inference How do you interpret the numbers? What can you conclude from the numbers? Does the evidence mean good news? Bad news? News we can't interpret? To reach an inference, sometimes we analyze the data in other ways (ask for more evidence). Early Childhood Outcomes Center

Inference Inference is debatable; even reasonable people can reach different conclusions. Stakeholders can help put meaning on the numbers. Early on, the inference may be more a question of the quality of the data. Early Childhood Outcomes Center

Action Given the inference from the numbers, what should be done? Recommendations or action steps Action can be debatable – and often is Another role for stakeholders Again, early on the action might have to do with improving the quality of the data Early Childhood Outcomes Center

Promoting quality data through data analysis

Checking to see if ratings accurately reflect child status: pattern checking. We have expectations about how child outcomes data should look: compared to what we expect, compared to other data in the state, and compared to similar states/regions/school districts. When the data are different than expected, ask follow-up questions. Early Childhood Outcomes Center

Questions to ask Do the data make sense? Am I surprised? Do I believe the data? Believe some of the data? All of the data? If the data are reasonable (or when they become reasonable), what might they tell us? Early Childhood Outcomes Center

Let’s look at some data … Early Childhood Outcomes Center

Remember: Child Outcomes (see reference sheet) 1. Positive social-emotional skills (including social relationships); 2. Acquisition and use of knowledge and skills (including early language/communication [and early literacy]); and 3. Use of appropriate behaviors to meet their needs Early Childhood Outcomes Center

Remember: COSF 7-point scale (see reference sheet)
7 - Completely: age-appropriate functioning in all or almost all everyday situations; no concerns
6 - Age-appropriate functioning, but some significant concerns
5 - Somewhat: age-appropriate functioning some of the time and/or in some settings and situations
4 - Occasional age-appropriate functioning across settings and situations; more functioning is not age-appropriate than age-appropriate
3 - Nearly: not yet age-appropriate functioning; immediate foundational skills most or all of the time
2 - Occasional use of immediate foundational skills
1 - Not yet: not yet age-appropriate functioning or immediate foundational skills
Early Childhood Outcomes Center

Remember: Reporting Categories (see reference sheet) Percentage of children who:
a. Did not improve functioning
b. Improved functioning, but not sufficient to move nearer to functioning comparable to same-aged peers
c. Improved functioning to a level nearer to same-aged peers but did not reach it
d. Improved functioning to reach a level comparable to same-aged peers
e. Maintained functioning at a level comparable to same-aged peers
3 outcomes x 5 “measures” = 15 numbers. Early Childhood Outcomes Center

Remember: Summary Statements (see cheat sheet) Summary Statement 1: Of those children who entered the program below age expectations in each outcome, the percent who substantially increased their rate of growth by the time they exited the program: (c + d) / (a + b + c + d). Early Childhood Outcomes Center

Remember: Summary Statements (see cheat sheet) Summary Statement 2: The percent of children who were functioning within age expectations in each outcome by the time they exited the program: (d + e) / (a + b + c + d + e). Early Childhood Outcomes Center
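
To make the arithmetic concrete, here is a minimal Python sketch that computes both summary statements from the five progress-category counts; the counts used are made-up examples.

```python
# Summary statements computed from the OSEP progress-category counts (a-e).
# The counts below are hypothetical, for illustration only.
counts = {"a": 10, "b": 40, "c": 120, "d": 80, "e": 50}
a, b, c, d, e = (counts[k] for k in "abcde")

# Summary Statement 1: of children who entered below age expectations,
# the percent who substantially increased their rate of growth by exit.
ss1 = 100 * (c + d) / (a + b + c + d)

# Summary Statement 2: the percent of children functioning within age
# expectations by the time they exited.
ss2 = 100 * (d + e) / (a + b + c + d + e)

print(f"Summary Statement 1: {ss1:.1f}%")  # 80.0%
print(f"Summary Statement 2: {ss2:.1f}%")  # 43.3%
```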

National Data Early Childhood Outcomes Center

Note: Based on 33 States with highest quality data

Note: Based on 33 States with highest quality data Early Childhood Outcomes Center

Note: Based on 29 States with highest quality data Early Childhood Outcomes Center

Note: Based on 29 States with highest quality data Early Childhood Outcomes Center

Sample State Data Early Childhood Outcomes Center

Key to Good Data: Have a good outcome measurement SYSTEM. Early Childhood Outcomes Center

Quality Checks: missing data and pattern checking. Full ECO Pattern Checking Table: http://www.fpg.unc.edu/~eco/assets/pdfs/Pattern_Checking_Table.pdf Early Childhood Outcomes Center

Missing Data - Overall How many children should the state be reporting to OSEP in the SPP/APR table (i.e., how many children exited during the year after at least 6 months in the program)? Do you have a way to know? What percentage of those children do you have in the table? Are you missing data selectively (by local program, or by child or family characteristic)? Early Childhood Outcomes Center
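
A minimal Python sketch of this missing-data check, assuming a table of exiting children with hypothetical columns months_in_program, has_outcome_data, and local_program:

```python
# Illustrative missing-data check (hypothetical columns and toy data).
import pandas as pd

exits = pd.DataFrame({
    "local_program": ["A", "A", "B", "B", "B", "C"],
    "months_in_program": [8, 14, 5, 9, 20, 12],
    "has_outcome_data": [True, True, False, True, False, True],
})

# Children who exited during the year after at least 6 months of service
# form the group the state should be reporting to OSEP.
expected = exits[exits["months_in_program"] >= 6]

overall_pct = 100 * expected["has_outcome_data"].mean()
print(f"Outcome data reported for {overall_pct:.0f}% of expected children")

# Is data missing selectively, e.g., by local program?
pct_by_program = 100 * expected.groupby("local_program")["has_outcome_data"].mean()
print(pct_by_program.round(0))
```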

Pattern Checking The quality of the child outcomes data is established by a series of analyses that demonstrate the data show predictable patterns: across outcomes, across time, and compared with disability information. Early Childhood Outcomes Center
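
As one illustrative way to operationalize an "across outcomes" check, the sketch below (hypothetical data and column names) looks at whether exit ratings on the three outcomes are positively related:

```python
# One illustrative "across outcomes" check: exit ratings on the three
# outcomes would be expected to be positively related (toy data).
import pandas as pd

ratings = pd.DataFrame({
    "out1_exit": [3, 5, 6, 7, 4, 2],
    "out2_exit": [4, 5, 6, 7, 5, 3],
    "out3_exit": [3, 6, 6, 7, 4, 3],
})

# Pairwise correlations; very low or negative values would be a signal
# to look more closely at how ratings are being made or entered.
print(ratings.corr().round(2))
```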

Information available for pattern checking OSEP Progress Categories Entry Data Exit Data Summary Statements Early Childhood Outcomes Center

Predicted Pattern #1 1a. Children will differ from one another in their entry scores in reasonable ways (e.g., fewer scores at the high and low ends of the distribution, more scores in the middle). 1b. Children will differ from one another in their exit scores in reasonable ways. 1c. Children will differ from one another in their OSEP progress categories in reasonable ways. Early Childhood Outcomes Center

Rationale Evidence suggests EI and ECSE serve more children with mild impairments than with severe impairments (e.g., few ratings/scores at the lowest end). Few children receiving services would be expected to be functioning typically (few ratings/scores in the typical range). Early Childhood Outcomes Center

Predicted Pattern #1 (cont’d) Analysis Look at the distribution of rating/scores at entry and exit and the data reported to OSEP. Look at the percentage of children who scored as age appropriate (or not) on all three outcomes at entry and at exit. Question: Is the distribution sensible? What do you expect to see? Early Childhood Outcomes Center
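
A minimal Python sketch of this analysis for one outcome, using made-up ratings and hypothetical column names:

```python
# Illustrative look at the distribution of entry and exit ratings (1-7)
# for one outcome; the ratings are made up for the example.
import pandas as pd

ratings = pd.DataFrame({
    "entry_rating": [2, 3, 3, 4, 4, 4, 5, 5, 6, 3],
    "exit_rating":  [4, 5, 4, 6, 5, 6, 7, 6, 7, 4],
})

# Percent of children at each rating point at entry and at exit.
for col in ("entry_rating", "exit_rating"):
    dist = 100 * ratings[col].value_counts(normalize=True).sort_index()
    print(f"\n{col} distribution (%):")
    print(dist.round(1))
```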

Exit Data Early Childhood Outcomes Center

Poll Do you pull child outcomes data by the 1-7 ratings on the scale? What have you learned? Early Childhood Outcomes Center

Hypothetical State Data: Outcome 1 Exit Data Early Childhood Outcomes Center

Hypothetical State Data: Outcome 2 Exit Data Early Childhood Outcomes Center

Hypothetical State Data: Outcome 3 Exit Data Early Childhood Outcomes Center

Poll Do you look at children exiting the program at age expectations across all three outcomes (6 & 7)? What have you learned? Early Childhood Outcomes Center

Question? What would you anticipate as the percentage of children who leave your program and no longer need special education services? Early Childhood Outcomes Center

‘At Age’ across all three outcomes: OSEP progress categories Percent of children that scored a 6 or 7 on all three outcomes at entry %xx (n = xx) Percent of children that scored a 6 or 7 on all three outcomes at exit Early Childhood Outcomes Center
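
A minimal Python sketch of this check, with hypothetical ratings and column names (out1_entry ... out3_exit):

```python
# Illustrative check: percent of children rated 6 or 7 on all three
# outcomes at entry and at exit (hypothetical ratings and column names).
import pandas as pd

df = pd.DataFrame({
    "out1_entry": [3, 6, 4, 7, 2],
    "out2_entry": [4, 6, 5, 6, 3],
    "out3_entry": [5, 7, 4, 6, 3],
    "out1_exit":  [5, 7, 6, 7, 4],
    "out2_exit":  [6, 7, 6, 7, 5],
    "out3_exit":  [6, 7, 5, 7, 5],
})

for point in ("entry", "exit"):
    cols = [f"out{i}_{point}" for i in (1, 2, 3)]
    at_age_all_three = (df[cols] >= 6).all(axis=1)
    n = int(at_age_all_three.sum())
    pct = 100 * at_age_all_three.mean()
    print(f"At {point}: {pct:.0f}% (n = {n}) rated 6 or 7 on all three outcomes")
```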

Hypothetical State Data: Across all 3 outcomes Early Childhood Outcomes Center

Poll Do you pull child outcomes data by OSEP progress categories? Are you comparing your data to national or other states’ data? What have you learned? Early Childhood Outcomes Center

OSEP Categories Early Childhood Outcomes Center

Hypothetical State Data: Outcome 1 OSEP Categories Early Childhood Outcomes Center

Hypothetical State Data: Outcome 2 OSEP Categories Early Childhood Outcomes Center

Hypothetical State Data: Outcome 3 OSEP Categories Early Childhood Outcomes Center

Predicted Pattern #4b 4b. Children will not show huge changes between entry and exit. Analyses: Comparison of entry and exit scores (exit score minus entry score) Question: What do we expect to see? Early Childhood Outcomes Center
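
A minimal Python sketch of this comparison, flagging children whose rating increased by 4 or more points (toy data and hypothetical column names):

```python
# Illustrative check for unusually large gains: children whose rating
# increased by 4 or more points from entry to exit (toy data).
import pandas as pd

df = pd.DataFrame({
    "entry_rating": [1, 2, 3, 5, 2, 4],
    "exit_rating":  [6, 3, 7, 6, 7, 5],
})

change = df["exit_rating"] - df["entry_rating"]
big_gain = change >= 4
print(f"{100 * big_gain.mean():.0f}% of children gained 4 or more points")
print(df[big_gain])  # records worth a closer data-quality look
```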

Poll Are you pulling your data to look at the number of children who increase by 4 or more points on the 7 point scale? What have you learned? Early Childhood Outcomes Center

Outcome 1: Children who increased by 4 or more points from entry to exit Early Childhood Outcomes Center

Outcome 2: Children who increased by 4 or more points from entry to exit Early Childhood Outcomes Center

Outcome 3: Children who increased by 4 or more points from entry to exit Early Childhood Outcomes Center

Wrap-up Early Childhood Outcomes Center

Drilling down: Looking at data by local program All analyses that can be run with the state data can be run with the local data. The same patterns should hold and the same predictions apply. Be careful about the size of n with small programs. Early Childhood Outcomes Center
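
A minimal Python sketch of one such drill-down, reporting the percent of children exiting at age expectations (rated 6 or 7) on Outcome 1 by local program, along with n (hypothetical data and column names):

```python
# Illustrative drill-down by local program: percent of children exiting
# at age expectations (rated 6 or 7) on Outcome 1, with n per program.
import pandas as pd

df = pd.DataFrame({
    "local_program": ["A", "A", "A", "B", "B", "C"],
    "out1_exit":     [6, 7, 4, 5, 6, 7],
})

by_program = df.groupby("local_program")["out1_exit"].agg(
    n="count",
    pct_at_age=lambda s: 100 * (s >= 6).mean(),
)
print(by_program)  # interpret or suppress rows where n is small
```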

Are your data high quality? Are the missing data less than 5%? Do your state’s data support the predicted patterns? If not, where are the problems? What do you know or can you find out about why they are occurring? Early Childhood Outcomes Center

Questions to ask Do the data make sense? Am I surprised? Do I believe the data? Believe some of the data? All of the data? If the data are reasonable (or when they become reasonable), what might they tell us? Early Childhood Outcomes Center

Discussion Early Childhood Outcomes Center

Find more resources at: http://www.the-eco-center.org Early Childhood Outcomes Center