National Picture – Child Outcomes for Early Intervention and Preschool Special Education
Cornelia Taylor, ECO at SRI
Kathy Hebbeler, ECO at SRI
October 2013

On Today's Call
Brief review of the national data
Data quality
– Completeness of data
– State-to-state variation
– Pattern checking with other variables
– Change over time

State Approaches to Measuring Child Outcomes

Approach                      | Part C (N=56) | Preschool (N=59)
COS 7-pt. scale               | 42/56 (75%)   | 37/59 (63%)
One tool statewide            | 8/56 (14%)    | 9/59 (15%)
Publishers' online analysis   | 1/56 (2%)     | 6/59 (10%)
Other                         | 5/56 (9%)     | 7/59 (12%)

Methods for Calculating National Estimates
1. All states averaged (each state weighted as 1)
2. All states weighted by child count
3. States with the highest quality data weighted by child count to represent all states*
* The approach used for the national picture data presented here.
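A minimal sketch of the three estimation methods, using made-up state records (the values below are illustrative, not from the slides):

```python
# Three ways to compute a national estimate of a summary statement value.
# The state records below are hypothetical examples.

states = [
    # (summary_statement_pct, child_count, meets_quality_criteria)
    (68.0, 12000, True),
    (74.5, 3500, False),
    (61.2, 20000, True),
]

# Method 1: all states averaged, each state weighted as 1
method1 = sum(pct for pct, _, _ in states) / len(states)

# Method 2: all states weighted by child count
total_children = sum(n for _, n, _ in states)
method2 = sum(pct * n for pct, n, _ in states) / total_children

# Method 3: only states meeting the quality criteria, weighted by child count,
# used to represent all states (the approach behind the national picture data)
quality = [(pct, n) for pct, n, ok in states if ok]
method3 = sum(pct * n for pct, n in quality) / sum(n for _, n in quality)

print(round(method1, 1), round(method2, 1), round(method3, 1))
```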

Identifying States with Quality Data
Criteria for high quality data:
Reporting data on enough children
– Part C: 28% or more of exiters
– Preschool: 12% or more of child count
Within expected patterns in the data
– Category 'a' not greater than 10%
– Category 'e' not greater than 65%
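A minimal sketch of these inclusion checks, assuming a hypothetical per-state summary of completeness and progress-category percentages:

```python
# Check whether a state's data meet the high-quality criteria listed above.
# The data structure (a dict of progress-category percentages per outcome)
# is a hypothetical representation, not a published format.

def meets_quality_criteria(program, pct_with_outcomes, category_pcts):
    """program: "Part C" or "Preschool"
    pct_with_outcomes: percent of exiters (Part C) or of child count (Preschool)
                       with outcomes data reported
    category_pcts: e.g. {"OC1": {"a": 2, "b": 10, "c": 25, "d": 40, "e": 23}, ...}
    """
    # Completeness criterion
    minimum = 28 if program == "Part C" else 12
    if pct_with_outcomes < minimum:
        return False

    # Expected-pattern criterion: every outcome must have a <= 10% and e <= 65%
    for cats in category_pcts.values():
        if cats.get("a", 0) > 10 or cats.get("e", 0) > 65:
            return False
    return True
```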

Number of States that Met Criteria for Inclusion in the National Analysis (Part C and Preschool)

Part C – Reasons States Were Excluded from Analyses

Reason Part C state was excluded (number of states):
State is sampling: 3 / 2
No outcomes data reported: 0 / 0
Reported outcomes data on less than 28% of reported exiters: 3 / 6
Had at least one outcome with category 'a' greater than 10% or category 'e' greater than 65%: 4 / 5
Reported outcomes data on less than 28% of reported exiters AND had at least one outcome with category 'a' greater than 10% or category 'e' greater than 65%: 1 / 4
Questionable data quality based on review of SPP/APR and knowledge gained through TA: 1 / 0
States included in the analysis: 39 / 33

Part B Preschool – Reasons States Were Excluded from Analyses

Reason Part B state was excluded (number of states):
State is sampling: 4 / 2
No progress category data reported: 1 / 2
No outcomes data reported: 1 / 0
Reported outcomes data on less than 12% of child count: 2 / 4
Had at least one outcome with category 'a' greater than 10% or category 'e' greater than 65%: 4 / 3
Reported outcomes data on less than 12% of child count AND had at least one outcome with category 'a' greater than 10% or category 'e' greater than 65%: 0 / 0
Questionable data quality based on review of SPP/APR and knowledge gained through TA: 2 / 0
No child count data available: 1 / 0
States included in the analysis: 36 / 39

Good News!
Consistent data over time
Increasing number of children in the child outcomes data
Increasing number of states in the 'quality' data for child outcomes

Current Emphasis
Data Quality
– Increasing the number of children/families in the data
– Pattern checking to identify data quality issues
– Training, guidance, supervision, etc.

Part C: Percent of states by completeness of child outcomes data*
* Completeness = (total with outcomes data / total exiters)

Part B 619: Percent of states by completeness of child outcomes data*
* Completeness = (total with outcomes data / child count)
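The completeness measure behind both slides is the same ratio with different denominators; a minimal sketch, with hypothetical example counts:

```python
# Completeness as defined on the two slides above; the example counts are made up.

def completeness(total_with_outcomes, denominator):
    """Percent of the denominator with child outcomes data reported.

    Part C denominator: total exiters.
    Part B 619 denominator: child count.
    """
    return 100.0 * total_with_outcomes / denominator

part_c_pct = completeness(total_with_outcomes=4200, denominator=6000)    # 70.0
part_b_pct = completeness(total_with_outcomes=9800, denominator=70000)   # 14.0
```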

State-Level Variation
Lots of variation across states in summary statement values
Variation is not a direct result of percent served
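For readers new to these measures: the summary statement values referred to throughout the deck are the two OSEP summary statements computed from the five COS progress categories (a–e). The sketch below gives the standard formulas for reference; they are not stated on the slides themselves.

```python
# The two OSEP summary statements, computed from progress-category counts a-e.
# Provided for reference; not shown on the original slides.

def summary_statements(a, b, c, d, e):
    # SS1: of children who entered below age expectations, the percent who
    # substantially increased their rate of growth by the time they exited.
    ss1 = 100.0 * (c + d) / (a + b + c + d)
    # SS2: the percent of children functioning within age expectations by exit.
    ss2 = 100.0 * (d + e) / (a + b + c + d + e)
    return ss1, ss2
```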

Part C: Positive Social Emotional Skills, Progress Category 'b'

Part B Preschool: Positive Social Emotional Skills, Progress Category 'b'

Part C – Knowledge and Skills: State Percentages for Increased Rate of Growth

Part B Preschool – Meets Needs: State Percentages for Exited Within Age Expectations

Pattern Checking

Part C: Average Percentage Who Exited Within Age Expectations, by State Percent Served

Part B 619 Longitudinal Patterns, All States: Outcome 1, Summary Statement 1

Part C Longitudinal Patterns, All States: Outcome 1, Summary Statement 1

Looking only at states that met the quality criteria for inclusion in the national analysis, based on:
– Missing data criteria
– Patterning criteria
– APR/SPP reviews

Part B 619 Longitudinal Patterns, Best States from Last 3 Years: Outcome 1, Summary Statement 1 (n=28)

Part B 619 Longitudinal Patterns, Best States All 4 Years: Outcome 1, Summary Statement 1 (12 states)

Discussion
How should we interpret differences between state values?
What pieces of information already reported would predict summary statement values?

Interpreting Change Over Time

What Types of Change Are Important?
Small variations from year to year are expected
Large, consistent increases are good news, particularly when linked to programmatic changes
Large, consistent decreases require explanation (e.g., a changing population)
Large up-and-down changes are an indicator of questionable data quality and require explanation
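A minimal sketch of sorting a state's year-to-year summary statement values into these patterns; the ±3 percentage-point threshold for "small" variation is an illustrative assumption, not a figure from the slides.

```python
# Classify a series of yearly summary statement values into the change
# patterns described above. The +/- 3 percentage-point threshold for "small"
# variation is an illustrative assumption.

def classify_trend(yearly_values, small=3.0):
    diffs = [later - earlier for earlier, later in zip(yearly_values, yearly_values[1:])]
    if all(abs(d) <= small for d in diffs):
        return "small year-to-year variation (expected)"
    if all(d > small for d in diffs):
        return "large consistent increase (good news, especially if tied to programmatic change)"
    if all(d < -small for d in diffs):
        return "large consistent decrease (requires explanation, e.g. a changing population)"
    return "large up-and-down changes (questionable data quality; requires explanation)"

print(classify_trend([62.0, 64.1, 63.2]))   # small year-to-year variation (expected)
print(classify_trend([55.0, 61.5, 68.0]))   # large consistent increase ...
```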

Part C: Change Between Years – Statistically Significant Change
[Table: for each outcome and summary statement (OC1–OC3, SS1 and SS2), the number of states with a significant negative change, no significant change, and a significant positive change; the average positive change; and the minimum and maximum change (ranges shown include 2–15, 3–13, 2–18, 3–15, and 3–22).]

Part B Preschool: Change Between Years – Statistically Significant Change
[Table: for each outcome and summary statement (OC1–OC3, SS1 and SS2), the number of states with a significant negative change, no significant change, and a significant positive change; the average positive change; and the minimum and maximum change.]
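The slides do not say which test was used to judge statistical significance. One common choice for a change in a proportion such as a summary statement is a two-proportion z-test; the sketch below is a hypothetical illustration under that assumption, not necessarily the method the authors used.

```python
import math

# Hypothetical sketch: two-proportion z-test for whether a summary statement
# changed significantly between two years.

def two_proportion_z_test(x1, n1, x2, n2):
    """x = children counted in the summary statement numerator,
    n = children in the denominator, for year 1 and year 2."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    # two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Example with made-up counts
z, p = two_proportion_z_test(x1=1200, n1=2000, x2=1320, n2=2100)
print(round(z, 2), round(p, 4))
```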

Conclusions
The data continue to be used by the federal government to justify funding
Results Driven Accountability is shining a spotlight on each state's child outcomes data
States can expect more scrutiny around data quality

How We Can Help!
Contact Cornelia for a state profile of your data quality
Contact us for help with data quality analysis and quality assurance activities
Contact us for help with program improvement planning and data analysis

National Graphing Template

Other Resources
National graphing templates
Data quality TA resources
Data analysis for program improvement TA resources

Upcoming Family Data Webinar
Stay tuned for an upcoming presentation of Family Data: Indicator C4 Highlights

Find more resources at: