1 Data Analysis for Assuring the Quality of your COSF Data

2 What are these numbers??

3 OSEP reporting requirements: the outcomes
Percentage of children who demonstrated improved:
1. Positive social-emotional skills (including positive social relationships)
2. Acquisition and use of knowledge and skills (including early language/communication [and early literacy])
3. Use of appropriate behaviors to meet their needs

4 OSEP reporting categories
Percentage of children who:
a. Did not improve functioning
b. Improved functioning, but not sufficient to move nearer to functioning comparable to same-aged peers
c. Improved functioning to a level nearer to same-aged peers but did not reach it
d. Improved functioning to reach a level comparable to same-aged peers
e. Maintained functioning at a level comparable to same-aged peers
3 outcomes x 5 “measures” = 15 numbers

5 Getting to progress categories from the COSF ratings

6 [Chart: Functioning over time]

7 [Chart: Functioning at entry]

8 [Chart: Functioning at exit]

9 [Chart: Entry and exit ratings]

10 Key Point
The OSEP categories describe the types of progress children can make between entry and exit.
Two COSF ratings (entry and exit) are needed to calculate which OSEP category describes a child's progress.

11 How changes in ratings on the COSF correspond to reporting categories a - e
e. % of children who maintain functioning at a level comparable to same-aged peers
Rated 6 or 7 at entry; AND
Rated 6 or 7 at exit

12 [Chart: Entry and exit ratings]

13 [Chart: Entry and exit ratings]

14 [Chart: Entry and exit ratings]

15 How changes in ratings on the COSF correspond to reporting categories a - e
d. % of children who improve functioning to reach a level comparable to same-aged peers
Rated 5 or lower at entry; AND
Rated 6 or 7 at exit

16 [Chart: Entry and exit ratings]

17 How changes in ratings on the COSF correspond to reporting categories a - e
c. % of children who improved functioning to a level nearer to same-aged peers, but did not reach it
Rated higher at exit than at entry; AND
Rated 5 or below at exit

18 [Chart: Entry and exit ratings]

19 [Chart: Entry and exit ratings]

20 How changes in ratings on the COSF correspond to reporting categories a - e
b. % of children who improved functioning, but not sufficient to move nearer to same-aged peers
Rated 5 or lower at entry; AND
Rated the same or lower at exit; AND
“Yes” on the progress question (b)

21 [Chart: Entry and exit ratings]

22 [Chart: Entry and exit ratings]

23 [Chart: Entry and exit ratings]

24 [Chart: Entry and exit ratings]

25 How changes in ratings on the COSF correspond to reporting categories a - e
a. % of children who did not improve functioning
Rated lower at exit than at entry; OR
Rated 1 at both entry and exit; AND
Scored “No” on the progress question (b)

26 [Chart: Entry and exit ratings]

27 [Chart: Entry and exit ratings]

28 The ECO Calculator can be used to translate COSF entry and exit ratings into the 5 progress categories for federal reporting
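The mapping summarized on the preceding slides can also be written out directly. The sketch below is a minimal Python illustration, not the ECO Calculator itself; the function name, argument names, and the order in which the rules are checked are assumptions made for this example, so the ECO Calculator should be treated as the authoritative implementation for federal reporting.

```python
def osep_category(entry_rating, exit_rating, progress):
    """Translate a pair of COSF ratings (1-7) plus the yes/no progress
    question into an OSEP reporting category (a-e), following the rules
    summarized on the earlier slides."""
    if not (1 <= entry_rating <= 7 and 1 <= exit_rating <= 7):
        raise ValueError("COSF ratings must be whole numbers from 1 to 7")

    if entry_rating >= 6 and exit_rating >= 6:
        return "e"  # maintained functioning comparable to same-aged peers
    if entry_rating <= 5 and exit_rating >= 6:
        return "d"  # improved to reach a level comparable to same-aged peers
    if exit_rating > entry_rating and exit_rating <= 5:
        return "c"  # moved nearer to same-aged peers but did not reach them
    if entry_rating <= 5 and exit_rating <= entry_rating and progress:
        return "b"  # improved, but did not move nearer to same-aged peers
    return "a"      # did not improve functioning


# A few entry/exit patterns like those illustrated on the charts above:
print(osep_category(6, 7, progress=True))   # e
print(osep_category(3, 6, progress=True))   # d
print(osep_category(3, 5, progress=True))   # c
print(osep_category(4, 4, progress=True))   # b
print(osep_category(4, 3, progress=False))  # a
```

Note that the two ratings alone are not always enough: the answer to the COSF progress question is what separates category b from category a when the exit rating is the same as or lower than the entry rating.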

29 Promoting quality data through data analysis

30 Promoting quality data through data analysis
Examine the data for inconsistencies.
If/when you find something strange, what might help explain it?
Is the variation because of the program, or because of bad data?
(At this point in the implementation process, data quality issues are likely!)

31 The validity of your data is questionable if…
The overall pattern in the data looks ‘strange’:
–Compared to what you expect
–Compared to other data
–Compared to similar states/regions/agencies

32 COSF Ratings – Outcome 1 Entry data (fake data)
Rating   Statewide
1        30
2        42
3        51
4        60
5        10
6        –
7        0

33 COSF Ratings – Outcome 1 Entry data (fake data)
Rating   Statewide
1        30 (15%)
2        42 (20%)
3        51 (25%)
4        60 (30%)
5        10 (5%)
6        –
7        0 (0%)

34 [Chart: Frequency on Outcome 1 – Statewide (fake data)]

35 COSF Ratings – Outcome 1 Entry data (fake data)
Rating   Agency 1   Agency 2   Agency 3   Agency 4
1        3          1          1          2
2        4          1          2          2
3        5          2          3          3
4        6          3          2          4
5        1          4          5          4
6        1          5          5          4
7        0          4          2          1

36 COSF Ratings – Outcome 1 Entry data (fake data)
Rating   Group 1   Group 2   Group 3   Group 4
1        15%       5%        5%        10%
2        20%       5%        10%       10%
3        25%       10%       15%       15%
4        30%       15%       10%       20%
5        5%        20%       25%       20%
6        5%        25%       25%       20%
7        0%        20%       10%       5%
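Each agency column in the count table sums to 20 children, so converting counts to percentages simply multiplies each count by 5 and reproduces the Group 1-4 distributions above. As a minimal sketch, the same conversion in Python using the fake counts from the slide:

```python
# Fake entry counts for Outcome 1 from the slide above (ratings 1 through 7).
counts = {
    "Agency 1": [3, 4, 5, 6, 1, 1, 0],
    "Agency 2": [1, 1, 2, 3, 4, 5, 4],
    "Agency 3": [1, 2, 3, 2, 5, 5, 2],
    "Agency 4": [2, 2, 3, 4, 4, 4, 1],
}

for agency, by_rating in counts.items():
    total = sum(by_rating)                                # children served by this agency
    pct = [round(100 * n / total) for n in by_rating]     # counts -> percentages
    print(agency, " ".join(f"{p}%" for p in pct))
```

Any spreadsheet does the same calculation; the point is that agency-to-agency comparisons are far easier to read as percentage distributions than as raw counts, especially when agencies serve different numbers of children.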

37 Questions to ask when looking at data
Do the data make sense?
–Am I surprised?
–Do I believe the data? Some of it? All of it?
If the data are reasonable (or when they become reasonable), what might they tell us?
When we believe the data, how can we use them for program improvement?

38 Using data for program improvement

39 Continuous Program Improvement
[Diagram: a cycle – Plan (vision), Implement, Check (collect and analyze data), Reflect (Are we where we want to be?) – linking program characteristics to child and family outcomes]

40 Using data for program improvement = EIA
Evidence
Inference
Action

41 Evidence
Evidence refers to the numbers, such as “45% of children in category b.”
The numbers are not debatable.

42 Inference
How do you interpret the numbers? What can you conclude from them?
Does the evidence mean good news? Bad news? News we can’t interpret?
To reach an inference, sometimes we analyze the data in other ways (ask for more evidence).

43 Inference
Inference is debatable – even reasonable people can reach different conclusions.
Stakeholders can help put meaning on the numbers.
Early on, the inference may be more a question of the quality of the data.

44 Explaining variation
Who has good outcomes? Do outcomes vary by:
–Region of the state?
–Amount of services received?
–Type of services received?
–Age at entry to service?
–Level of functioning at entry?
–Family outcomes?
–Education level of parent?
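One way to look at these questions is a simple cross-tabulation of reporting category by subgroup. The sketch below assumes a hypothetical child-level file (child_outcomes.csv) with illustrative column names such as region and category; the file name and columns are assumptions for this example, not a required format.

```python
import pandas as pd

# Hypothetical child-level records; file name and column names are
# illustrative assumptions, not a prescribed layout.
df = pd.read_csv("child_outcomes.csv")  # e.g. columns: child_id, region, category

# Percent of children in each OSEP category (a-e), broken out by region.
by_region = pd.crosstab(df["region"], df["category"], normalize="index") * 100
print(by_region.round(1))
```

The same cross-tab against amount of services, entry age, or entry rating produces the other breakdowns in the list above.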

45 Action
Given the inference from the numbers, what should be done?
Recommendations or action steps.
Action can be debatable – and often is. Another role for stakeholders.
Again, early on the action might have to do with improving the quality of the data.

46 Working Assumptions
There are some high quality services and programs being provided across the state.
There are some children who are not getting the highest quality services.
If we can find ways to improve those services/programs, these children will experience better outcomes.

47 Questions to ask of your data
Are ALL services high quality?
Are ALL children and families receiving ALL the services they should in a timely manner?
Are ALL families being supported in being involved in their child’s program?
What are the barriers to high quality services?

48 Program improvement: where and how
–At the state level – TA, policy
–At the agency level – supervision, guidance
–At the child level – modify intervention

49 Key points
Evidence refers to the numbers, and the numbers by themselves are meaningless.
Inference is attached by those who read (interpret) the numbers.
You have the opportunity and the obligation to attach meaning.

50 Tweaking the System
[Diagram: the same continuous improvement cycle – Plan (vision), Implement, Check (collect and analyze data), Reflect (Are we where we want to be?) – annotated with: Is there a problem? Why is it happening? What should be done? Is it being done? Is it working?]

51 Continuous means…
…the cycle never ends.

