Looking at Data Presented by The Early Childhood Outcomes Center


1 Looking at Data Presented by The Early Childhood Outcomes Center
Revised January 2013

2 Using data for program improvement = EIA
Evidence - Inference - Action

3 Evidence
Evidence refers to the numbers, such as "45% of children in category b." The numbers themselves are not debatable.

4 Inference
How do you interpret the numbers? What can you conclude from them? Does the evidence mean good news? Bad news? News we can't interpret? To reach an inference, we sometimes analyze the data in other ways (that is, we ask for more evidence).

5 Inference
Inference is debatable: even reasonable people can reach different conclusions from the same set of numbers. Stakeholder involvement can be helpful in making sense of the evidence.

6 Action
Given the inference from the numbers, what should be done? Action takes the form of recommendations or action steps. Action can be debatable, and often is: another role for stakeholders.

7 What can we infer?
Poll results A: Candidate I.M. Good 51%, Candidate R.U. Kidding 49% (margin of error +/- 3%). Poll results B: Candidate I.M. Good 56%, Candidate R.U. Kidding 44% (margin of error +/- 3%).
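The two polls invite different inferences. A minimal sketch of the reasoning, using a rough rule of thumb (a lead is only clearly meaningful when it exceeds the combined margins of error); this is an illustration, not a formal significance test:

```python
def leads_outside_margin(pct_a, pct_b, margin):
    """Rough check: is A's lead larger than the combined +/- margins of error?

    A heuristic for illustration only, not a formal statistical test.
    """
    return (pct_a - pct_b) > 2 * margin

# Poll A: 51% vs. 49%, margin +/- 3 points -> the 2-point lead is inconclusive
print(leads_outside_margin(51, 49, 3))  # False
# Poll B: 56% vs. 44%, margin +/- 3 points -> the 12-point lead exceeds the margin
print(leads_outside_margin(56, 44, 3))  # True
```

With Poll A, the evidence cannot support the inference that either candidate is ahead; with Poll B, it can.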

8 Program improvement: Where and how
At the state level: TA and policy. At the regional or local level: supervision and guidance. At the classroom level: spend more time on certain aspects of the curriculum. At the child level: modify the intervention. There are different program-improvement levers at different levels. We will focus primarily on state-level use of information; some state applications translate directly to smaller units. How interventionists or teachers use outcome data for program improvement is a completely different topic, very important, but we are not going to cover it here.

9 Key points
Evidence refers to the numbers, and the numbers by themselves are meaningless. Inference is attached by those who read (interpret) the numbers. You have the opportunity, and the obligation, to attach meaning. You cannot prevent the misuse of data, but you can set up conditions that make it less likely.

10 E – I – A Jeopardy
Click over each cell to reveal an example of either evidence, inference, or action; have participants identify which examples are which. The cells:
COS users unaware of the need to answer the yes/no progress question.
90% of exit COSs in Program B missing a response to the yes/no progress question.
Revise COS procedures to emphasize completion of the yes/no progress question.
Conduct staff development on using the 7-point rating scale.
75% of children in Program A received entry ratings of 2.
COS users misunderstand the definition of points on the 7-point scale.
Currently used tools are not accurately assessing children's social-emotional skills.
Invest resources in materials for assessing social-emotional skills.
45% of children reported in category 'e' for statewide progress data, Outcome 1.

11 Use of Data Activity: Evidence-Inference-Action

12 Continuous Program Improvement
A cycle: Plan (vision) -> Implement -> Check (collect and analyze data on program characteristics and on child and family outcomes) -> Reflect (are we where we want to be?).

13 Tweaking the System
The same Plan -> Implement -> Check -> Reflect cycle, with a diagnostic question at each step: What should be done? Is it being done? Is it working? Is there a problem? Why is it happening?

14 Continuous means…
…the cycle never ends.

15 Outcome questions for program improvement, e.g.
Who has good outcomes? That is, do outcomes vary by: region of the state? level of functioning at entry? type of services received? age at entry to service? family outcomes? education level of the parent?

16 Examples of process questions
Are ALL services high quality? Are ALL children and families receiving ALL the services they should in a timely manner? Are ALL families being supported in being involved in their child's program? What are the barriers to high-quality services?

17 Working Assumptions
There are some high-quality services and programs being provided across the state. There are some children who are not getting the highest quality services. If we can find ways to improve those services and programs, these children will experience better outcomes.

18 Numbers as a tool
Heard on the street: "Why are we reducing children to a number?" So why do we need numbers?

19 The need to aggregate data on children in a given classroom or caseload

20 The need to aggregate across children within the school/program

21 The need to aggregate across districts/programs

22 The need to aggregate across the country

23 Examining COS data at one time point
For one group: frequency distribution tables and graphs. Comparing groups. Averages.

24 Distribution of COS Ratings in Fall
Outcome 1 (fake data for illustration; N = 500):

Rating     N      %
  7      350     70
  6      110     22
  5       20      4
  4        8      1.6
  3        6      1.2
  2        4      0.8
  1        2      0.4
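A frequency distribution like this can be computed directly from a list of ratings. A minimal sketch in Python, using fake data that echoes the slide's 500 ratings:

```python
from collections import Counter

def frequency_table(ratings):
    """Build an N-and-percent frequency table from a list of COS ratings (1-7)."""
    counts = Counter(ratings)
    total = len(ratings)
    return {r: (counts.get(r, 0), round(100 * counts.get(r, 0) / total, 1))
            for r in range(7, 0, -1)}

# Illustrative fake data matching the slide's distribution
ratings = [7] * 350 + [6] * 110 + [5] * 20 + [4] * 8 + [3] * 6 + [2] * 4 + [1] * 2
table = frequency_table(ratings)
print(table[7])  # (350, 70.0)
print(table[4])  # (8, 1.6)
```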

25 Frequency on Outcome 1 - Fall

26 Frequency on Outcome 1 - Fall

27 Comparison of two classes - Fall

28 Frequency on Outcome 1 - Fall

29 Frequency on Outcome 1 – Class 1

30 Average Scores on Outcomes by Class – Fall, 2008

Class         Social-Emotional   Knowledge and Skills   Action to Meet Needs
1                  4.5                 4.6                    4.7
2                  5.3                 5.2                    5.3
3                  4.9                 4.9                    4.9
4                  6.4                 5.9                    6.6
5                  4.3                 4.3                    4.3
6                  3.8                 2.9                    3.9
All Classes        5.03                4.63                   4.95
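Class averages like these can be computed per class and then overall. A sketch with hypothetical ratings (the names and numbers are illustrative, not the slide's data); note that an overall average weighted by class size can differ from the simple average of the class means:

```python
# Hypothetical per-class ratings for one outcome (illustrative data)
class_ratings = {
    "Class 1": [4, 5, 4, 5],
    "Class 2": [6, 5, 5],
}

def class_averages(by_class):
    """Average rating per class, plus an overall average weighted by class size."""
    means = {c: sum(v) / len(v) for c, v in by_class.items()}
    all_vals = [x for v in by_class.values() for x in v]
    means["All Classes"] = sum(all_vals) / len(all_vals)
    return means

avgs = class_averages(class_ratings)
print(round(avgs["Class 1"], 2))      # 4.5
print(round(avgs["All Classes"], 2))  # 4.86
```

Weighting by class size is one design choice; averaging the class means instead would weight each class equally regardless of how many children it serves.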

31 Average Scores on Outcomes by Class – Fall, 2008 (same table as slide 30)

32 Average Scores on Outcomes by Class – Fall, 2008 (same table as slide 30)

33 Looking at change over time
Extent of change on the rating scale. The OSEP categories. Developmental trajectories: maintaining or changing.

34 Extent of change on rating scale: Time 1 to Time 2
Outcome 1 (N = 500):

Progress                                                   N     %
Maintained age-expected functioning                      350    70
Maintained same level of functioning, not age-expected    60    12
Gained 3 steps                                            10     2
Gained 2 steps                                            25     5
Gained 1 step                                             50    10
Dropped 1 step                                             4     0.8
Dropped 2 steps                                            1     0.2
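Categories like these can be derived from each child's pair of ratings. A sketch, assuming for illustration that "age-expected functioning" corresponds to a rating of 6 or 7 (that cut-off is an assumption, not stated on the slide):

```python
def change_category(t1, t2):
    """Label movement on the 7-point scale from entry (t1) to exit (t2).

    Assumption for illustration: ratings of 6-7 count as age-expected.
    """
    if t1 >= 6 and t2 >= 6:
        return "Maintained age-expected functioning"
    if t2 == t1:
        return "Maintained same level, but not age-expected"
    steps = t2 - t1
    if steps > 0:
        return f"Gained {steps} step{'s' if steps > 1 else ''}"
    return f"Dropped {-steps} step{'s' if -steps > 1 else ''}"

print(change_category(7, 7))  # Maintained age-expected functioning
print(change_category(3, 5))  # Gained 2 steps
print(change_category(5, 4))  # Dropped 1 step
```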

35 OSEP progress categories
Looking at information across time; reducing the information to fewer categories to allow easier comparisons.

36 OSEP Categories

Category                                        2009 (%)  2010 (%)  2011 (%)
Maintained age-appropriate trajectory              23        22        24
Changed trajectory – age-appropriate               15        17        13
Changed trajectory – closer to age-appropriate     32        34        37
Same trajectory – progress                         28        25        25
Flat trajectory – no progress                       2         2         1

37 OSEP Categories

Category                                        2009 (%)  2010 (%)  2011 (%)
Maintained age-appropriate trajectory              23        22        24
Changed trajectory – age-appropriate               15        17        13
TOTAL: age-appropriate at exit                     38        39        37

38 OSEP Categories: Class 1 (%) vs. Class 2 (%)
(A table comparing the same rows by class; the per-class figures did not survive in the transcript, which repeats the statewide totals from slide 37.)

39 OSEP Categories

Category                                        2009 (%)  2010 (%)  2011 (%)
Changed trajectory – age-appropriate               15        17        18
Changed trajectory – closer to age-appropriate     32        34        37
TOTAL: greater than expected progress              47        51        55
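The TOTAL rows on these slides are simple sums of category percentages. A sketch using the 2009 figures from the slides; the a-e keys follow the usual OSEP category ordering but are labeled here for illustration:

```python
# OSEP progress category percentages for one year (2009 figures from the slides)
categories = {
    "a_no_progress": 2,
    "b_progress_same_trajectory": 28,
    "c_closer_to_age_appropriate": 32,
    "d_changed_to_age_appropriate": 15,
    "e_maintained_age_appropriate": 23,
}

# Summary indicators roll the five categories up into fewer numbers
age_appropriate_at_exit = (categories["d_changed_to_age_appropriate"]
                           + categories["e_maintained_age_appropriate"])
greater_than_expected_progress = (categories["c_closer_to_age_appropriate"]
                                  + categories["d_changed_to_age_appropriate"])
print(age_appropriate_at_exit)           # 38
print(greater_than_expected_progress)    # 47
```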

40 Working with data
Different levels of analysis are required for different levels of questions. Aggregation will work for you, but it loses detail about individual children. For example: 50 assessment items on 20 children in 5 classes, in fall and spring, is 50 x 20 x 5 x 2 = 10,000 pieces of information.
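The slide's multiplication can be written out directly:

```python
# The arithmetic from the slide: item-level data multiplies quickly
items_per_assessment = 50
children_per_class = 20
classes = 5
time_points = 2  # fall and spring

pieces_of_information = (items_per_assessment * children_per_class
                         * classes * time_points)
print(pieces_of_information)  # 10000
```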

41 Using assessment data at the classroom level
Looking at the data by child, both at a single point in time and over time. Looking at data for areas that cut across children.

42 Items Related to Outcome 1
Example: item results for 5 imaginary children (Carlos, Geeta, Eileen, Ming, Shaniqua), each rated A = Accomplished, E = Emerging, or NY = Not Yet on these items:
1. Plays well with others
2. Cooperates with peers in simple games
3. Stops for transition cues
4. Takes directions well from adults
5. Has at least one close friend
(The per-child ratings did not survive in the transcript.)

43 Example: COS Outcome Ratings for Class 3c by Child
A table of COS ratings for Carlos, Geeta, Eileen, Ming, and Shaniqua on Outcomes 1, 2, and 3 at Time 1 and Time 2. (Most per-child values did not survive in the transcript.)

44 What do you see in these data?
Example of an aggregated report for a program: percentage of children scoring 5 or higher on the COS by class, at Time 1 and Time 2 for each of Outcomes 1, 2, and 3:

Class   Percentages as printed (Time 1/Time 2 for Outcomes 1-3)
1a      65 70 50 51 49 52
1b      55 53 62 61 87 88
2a      47 43 67 66
2b      76 84 78 85 83
3a      97 98 95 100
(Rows 2a, 2b, and 3a lost some values in the transcript.)

What do you see in these data?

45 Outcome questions for program improvement, e.g.
Who has good outcomes? That is, do outcomes vary by: region of the state? level of functioning at entry? type of services received? age at entry to service? family outcomes? education level of the parent?

46 Looking at Data by Region
Percentage of children who changed developmental trajectories after one year of service:

Class 1: 45%    Class 2: 47%    Class 3: 23%

Note: this is not the percentage of children who moved ECO categories; here we have zeroed in on what we want to know. What do you think? What is a possible inference?

47 Looking at Data by Age at Entry
Percentage of children who changed developmental trajectories after one year of service:

36 to 40 months: 34%    41 to 44 months: 42%    45 to 49 months: 46%

This is where meaning gets important; there are several possible interpretations. What is a possible inference?
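Subgroup breakdowns like these come from grouping child records and computing a percentage within each group. A sketch with hypothetical records (field names and values are illustrative):

```python
# Hypothetical child records; field names are illustrative
records = [
    {"age_band": "36-40", "changed_trajectory": True},
    {"age_band": "36-40", "changed_trajectory": False},
    {"age_band": "41-44", "changed_trajectory": True},
    {"age_band": "45-49", "changed_trajectory": True},
    {"age_band": "45-49", "changed_trajectory": False},
]

def pct_changed_by_group(rows, key):
    """Percent of records with changed_trajectory=True within each group."""
    totals, changed = {}, {}
    for r in rows:
        g = r[key]
        totals[g] = totals.get(g, 0) + 1
        changed[g] = changed.get(g, 0) + (1 if r["changed_trajectory"] else 0)
    return {g: round(100 * changed[g] / totals[g], 1) for g in totals}

result = pct_changed_by_group(records, "age_band")
print(result)  # {'36-40': 50.0, '41-44': 100.0, '45-49': 50.0}
```

The same function works for any grouping variable (region, services received, and so on) by changing `key`.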

48 Take Home Message
You will want to look at your data in lots of different ways, and to think about the possible inferences. You may need other information to decide among possible inferences. Act on what you have learned.

49 Tweaking the System
The same Plan -> Implement -> Check -> Reflect cycle, with a diagnostic question at each step: What should be done? Is it being done? Is it working? Is there a problem? Why is it happening?

50 How will/might these data be used?
Federal level: overall funding decisions (accountability); resource allocation (e.g., what kind of TA to fund?); decisions about the effectiveness of the program in individual states. State level: program effectiveness? Program improvement? Local level.

51 Need for good data
The need for good data encompasses all three levels: federal, state, and local. Data quality depends on how well local programs are implementing procedures.

52 Many steps to ensuring quality data
Before: good data collection and training; a good data system and data-entry procedures. During: ongoing supervision of implementation; feedback to implementers; refresher training. After: review of COSF records; data analyses for validity checks; checks on the representativeness of the responses.
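The "after" checks (record review and validity analyses) can be partly automated. A sketch that flags common problems in a single COS record; all field names here are hypothetical:

```python
def validity_flags(record):
    """Flag common data-quality problems in one COS record.

    A sketch; the field names ('outcome1'..'outcome3', 'progress_yes_no')
    are hypothetical stand-ins for whatever the data system actually uses.
    """
    flags = []
    for field in ("outcome1", "outcome2", "outcome3"):
        rating = record.get(field)
        if rating is None:
            flags.append(f"{field}: missing rating")
        elif not 1 <= rating <= 7:
            flags.append(f"{field}: rating {rating} outside 1-7")
    if record.get("progress_yes_no") not in ("yes", "no"):
        flags.append("progress_yes_no: missing or invalid")
    return flags

# One outcome out of range, one outcome missing, progress question unanswered
print(validity_flags({"outcome1": 9, "outcome2": 5}))
```

Run over a whole data set, counts of such flags per program point directly at where refresher training or supervision is needed.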

53 Take Home Message
If you conclude the data are not (yet) valid, they cannot be used for program effectiveness, program improvement, or anything else. Inference: data not yet valid. Action: continue to improve data collection and quality assurance.

54 Data Exploration
Examine the data to look for inconsistencies. If and when you find something strange, look for other data you have that might help explain it. Is the variation caused by something other than bad data?

55 Obtaining good data
Focus on addressing the threats to good data: local providers do not understand the procedures; local providers do not follow the procedures; and others. Identify and address the threats.

56 How far along is our state?

57 Keeping our eye on the prize
High-quality services for children and families that will lead to good outcomes.

58 For more information…
The Early Childhood Outcomes Center

