Results Not Demonstrated AKA National Picture

Presentation transcript:

1

2 Results Not Demonstrated AKA

3 National Picture

4 Feb 2008 SPP and APR review
–Child Outcomes - Indicators C3 & B7
–Family Outcomes - Indicator C4
Highlights from...

5 State Approaches to Measuring Child Outcomes

Approach                            Part C (56 states)   Preschool (59 states)
One tool statewide                  8/56 (14%)           13/59 (22%)
Multiple publishers' online tools   2/56 (4%)            3/59 (5%)
COSF 7 pt. scale                    40/56 (71%)          36/59 (61%)
Other                               6/56 (11%)           7/59 (12%)

6 All approaches have challenges...
Approach: One tool statewide
Challenges:
–Defining age expectations
–Determining cutoffs for enough progress to be considered a change in growth trajectory

7 All approaches have challenges...
Approach: Publishers' analysis of on-line assessment tools
Challenges:
–Aligning assessment tool items with the 3 outcomes
–Programming the analysis to be comparable to other measurement approaches

8 All approaches have challenges...
Approach: Child Outcome Summary Form
Challenges:
–Getting consistency of interpretation and use
–Requires understanding of child development

9 Number of Children Included in Feb ‘08 SPP/APR Data
[Distribution of state Ns, binned from <30 upward: Part C (52 states), Preschool (53 states); per-bin counts not preserved in this transcript]

10

11

12

13 Part C – Trends across the 3 Outcomes

14 Part C Trends including states with N>30

15 Preschool – Trends across the 3 Outcomes (53 out of 60 States)

16 Preschool Trends including States with N>30

17 Assessment Tool Trends
Part C: HELP, BDI-2, AEPS, Carolina, ELAP
Preschool: Creative Curriculum, BDI-2, Brigance, AEPS, High Scope, WSS

18 Populations Included
Part C
–40 states statewide
–6 phasing in
–6 sampling
Preschool
–23 states statewide
–14 phasing in
–6 sampling
–5 included children in other EC programs

19 Definitions of “near entry”
Part C
–Variety of starting points; initial IFSP the most common reference point
–Earliest: as part of intake, or with eligibility determination
–Latest: within 6 months of enrollment
Preschool
–Wide variation
–From 30 days to 4 months from entry
–States using ongoing assessments used the “fall” data point for entry

20 Definitions of “near exit”
Part C
–About half defined near exit
–Typically within 30 to 60 days of exit
Preschool
–About two thirds provided a definition
–Ranged from 30 days to 6 months
–Included spring assessment points and “end of the school year”

21 Criteria for same-aged peers
–COSF: 6 or 7 on the scale, by definition
–Single tool statewide: variation in criteria across states; e.g. BDI 1.3 SD from the mean for 2 states and 1.5 SD from the mean for another state
–Publishers' analysis of data intended to correspond to COSF summary ratings

22 Caution – Interpreting Data
–Only represents children who have entered and exited since the outcome system was put in place in states
  –In a typical state, data may represent children who participated in the program for 6 to 12 months
–The quality of data collection usually increases over time as guidance gets clearer and practice improves implementation

23 Scanning Your Data for Unusual Patterns
First, focus on progress categories “a” and “e”:
–Should reflect the characteristics of the children served in the state (e.g. eligibility definition in Part C)
–Should be fairly stable over time (when data are high quality and representative)

24 Checking Category “a”
Percents too high?
–Should represent children with very significant delays or degenerative conditions (any improvement in functioning puts a child into “b”)
Why it may be too high:
–Decision rules based on different interpretations of “no progress”
–Tools without enough discrimination to show small amounts of progress

25 Checking Category “e”
Percents too high or low?
–Should represent children functioning at age expectations at entry and exit in each outcome area
–Do your patterns make sense for each outcome based on the children served in the state?
Why it may be too high or low:
–Decision rules based on different interpretations of “age expectations”

26 The validity of your data is questionable if…
–The n is too small
–The overall pattern in the data looks ‘strange’:
  –Compared to what you expect
  –Compared to other data
  –Compared to similar states
–The data are not representative:
  –Of all areas of the state
  –Of all kinds of families
  –Of all kinds of children

27 Improvement Activities
Almost all states (Part C and 619) are conducting training and professional development:
–Assessment strategies
–Data collection procedures
–Data analysis and use
–(and a little bit of) Practices to improve child outcomes

28 Improvement Activities
–Improving infrastructure for providing TA and support
–Conducting evaluation
  –Reviewing data for accuracy and quality
  –Reviewing and revising processes
  –Identifying successes and challenges in the implementation of the outcomes system
–Improving data collection and reporting

29 Results Not Demonstrated

30 Family Outcomes

31 Part C Tools for Family Outcomes

# (%) of states   Assessment Tool
25 (46%)          NCSEAM Family Survey
20 (37%)          ECO Family Survey
6 (11%)           State-developed surveys
3 (6%)            Added ECO items and/or NCSEAM items to their state survey

28 (52%) reported that they provided translations and/or translation services to assist families

32 Population

# (%) of states   Population
14 (26%)          Sample
38 (72%)          Census
1 (2%)            Combination

33 Variations in Target Populations

# States   Criteria: Families who…
29         were enrolled in Part C and receiving services at least six months
10         were enrolled in Part C at the time of the survey, or during a specific time period
6          had exited the program and had participated in services for at least six months
3          had exited the Part C program during a specified period
1          received services 9 or more months

5 states did not report on criteria for the population

34 Representativeness

# States   Variable
35         Reported on representativeness
27         Race/ethnicity
16         Region/geography
10         Age
10         Gender
3          Time in services
3          Primary disability/eligibility category
1          Medicaid eligibility
1          Primary language
1          Program size

35 Timeframes for Data Collection

# (%) of states   Timeframe
29 (56%)          annually, at a designated month or during a specific time period
13 (25%)          according to a schedule based on an individual child's participation in the Part C program
8 (15%)           other (e.g. basing the timing of family surveys on monitoring calendars, or a specific date for data collection for the current reporting period only)

36 Overall Trends for Part C Family Outcomes

Part C has helped family:   a. Know their rights   b. Communicate child's needs   c. Help child develop and learn
Mean                        80%                    81%                            87%
Range                       46-97%                 49-99%                         56-98%
States made progress        62%                    60%                            58%

37 Response Rate Variation
–By population: 32% sampled, 30% census
–By tool: 52% state-developed, 31% ECO, 27% NCSEAM
–By distribution method: 40% hand delivered, 36% combined distribution methods, 32% telephone surveys, 23% mailed surveys

38 Trends in Improvement Activities
Clarifying and developing policies and procedures (40 states)
–clarification of policies regarding family rights and family-centered services
–modifications to procedures related to the implementation of family surveys
Providing training and professional support (28 states)
–to providers and service coordinators regarding family rights and procedural safeguards
–effective practices relating to family-centered services
–understanding the procedures for implementing the measurement of family outcomes
–understanding and using the family survey data for program improvement

39 Trends in Improvement Activities
Conducting evaluation (27 states)
–evaluating the processes used to implement family outcome measurement in FFY 2005 (including distribution methods, follow-up, and methods of analysis)
–family focus groups or random interviews with families to validate outcomes data
Improving data collection and reporting (25 states)
–developing strategies for improving family survey response rates and the representativeness of the data

40 Tools used for B8: Preschool Parent Involvement

# (%) of states   Assessment Tool
21 (35%)          NCSEAM school-age survey
18 (30%)          State-developed survey (6 of these included NCSEAM items)
11 (18%)          Modified/customized NCSEAM survey
9 (15%)           NCSEAM school-age survey and preschool survey
1 (2%)            ECO Preschool Family Outcome survey

41 Results Not Demonstrated

42 Themes and Agenda

43 Preparing for the Future
–Setting Targets
–Improving Data Quality
  –Training & TA capacity
  –Written policies and procedures
  –Analysis and interpretation of the data
  –Quality Assurance / Monitoring
–Improvement Planning – for better data collection and for improved outcomes

44 Child Outcomes

45 Child Outcomes Family Outcomes

46 Child Outcomes Family Outcomes Teacher/Provider Skills

47 Child Outcomes Family Outcomes Teacher/Provider Skills Program/Classroom Quality

48 Child Outcomes Family Outcomes Teacher/Provider Skills Program/Classroom Quality System

49 Themes of Agenda Sessions
–Quality Assurance
  –Quality assessment data
  –Reliable use of tools
  –Quality of analysis and reporting
–Training and TA (to address quality)
–Collaboration
  –Part C and 619 Preschool
  –Across Early Care and Education

50 Themes of Agenda Sessions
–Challenges of particular approaches
  –Decision rules for “age expectations” and progress category assignment for states using one tool statewide
  –Consistent interpretation and use of the COSF
–Outcomes from the local and family perspectives

51 Themes of Agenda Sessions
–Building outcomes into monitoring and accountability systems
–Sampling issues and strategies
–Family outcomes
  –Using data for improving family services and supports
  –Return rates and representative data

