Using Data for Program Improvement


1 Using Data for Program Improvement
Christina Kasprzak, NECTAC/ECO, and Ann Bailey, NCRRC
July 2010
Early Childhood Outcomes Center

2 Systems Thinking
Systems are made up of interrelated, interconnected components.
Systems change involves changing the capacity, interrelationships, and interdependencies among parts, levels, and stakeholders.
Desired changes in one part or level of the system must be accompanied by changes in other parts or levels.
Early Childhood Outcomes Center

3 SPP as a long-term plan for systems change
EI/ECSE are systems of complex, interrelated components with the goal of achieving outcomes for children and families.
Changes to EI/ECSE systems require a combination of interconnected improvement activities that support changes to infrastructure and work together to achieve the desired results.
Early Childhood Outcomes Center

4 Developing improvement activities:
Effective Policies, Procedures, and MOUs
Data Systems and Monitoring
TA and Professional Development
Fiscal Management
Evaluation
Early Childhood Outcomes Center

5 Evaluating SPP/APR Improvement Activities
Early Childhood Outcomes Center

6 The Need for Evaluation
Provides an organized way of assessing work in progress and results obtained
Assesses the impact of an activity on the area targeted for improvement (i.e., specific indicators)
Identifies strengths and weaknesses in the implementation of the improvement activity
Keep in mind that it is important to develop a comprehensive plan for improvement, but it is equally important to assess the effectiveness of implementing that plan and the impact of the improvement efforts through your improvement activities. The purposes for conducting evaluation are numerous and can vary depending on the needs of the agency. Thinking critically about evaluation is the key to ensuring that the agency will obtain information that accurately reflects the progress and impact of the improvement activity. Evaluation can help the agency team judge whether the identified improvement activity has been successful in strengthening the designated area for improvement.

7 Types of Evaluation
Process evaluation: evaluates the improvement process itself
Impact evaluation: evaluates the results produced by the process
There are two types of evaluation that can and should be used to evaluate improvement activities: (1) evaluating the improvement process itself (process evaluation), and (2) evaluating the results of the improvement process (impact evaluation).

8 Process Evaluation Questions
Process evaluation questions might include:
To what extent is the improvement activity being implemented as intended?
To what extent is the improvement activity reaching the target audience (i.e., children, staff, parents)?
Is everyone doing what they said they would do?
Are resources still available to adequately support this improvement activity?

9 Impact Evaluation Questions
Impact evaluation questions might include:
Did the improvement activity accomplish what it was supposed to?
Which parts of the improvement activity worked well?
Which parts of the improvement activity did not work well?
Should the agency continue the improvement activity?
What has changed as a result of implementing the improvement activity?

10 Reviewing Improvement Activities
Improvement activities are aligned to the indicators, including whether they are reflected across related indicators.
Improvement activities reflect state priorities.
Improvement activities are actionable.
Improvement activities are realistic.
Before thinking about the entire evaluation plan, you may want to assess what you already know about your improvement activities, using the checklist on this slide and the next.

11 Reviewing Improvement Activities (continued)
Improvement activities include measures of performance.
Improvement activities include timelines.
Improvement activities identify responsibility for implementation.
Improvement activities include technical assistance needs.

12 Categorizing Improvement Activities
Training and Professional Development
Improve Data Collection
Improve Systems Administration and Monitoring
Improve Collaboration and Coordination
Program Development
Clarify/Examine/Develop Policies & Procedures
Provide Technical Assistance
Increase/Adjust FTE
Evaluation
Another way of thinking about how to assess the effectiveness of your improvement activities is to categorize them before evaluating them, which can help you look at your improvement activities as a whole. In most instances, improvement activities fall into the categories above. Because it may be too daunting a task to evaluate every single improvement activity included in your SPP/APR, you may want to focus your evaluation on one or two of the categories that seem to be most prevalent. For example, you may want to start by looking system-wide at training and professional development and at program development. Try to determine which category contains the most improvement activities and begin with that, as in the tally sketch below.
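To make that tally concrete, here is a minimal sketch in Python, assuming the state's improvement activities have already been listed and tagged with one of the categories above; the activity descriptions and tags are invented for illustration, not taken from any state's SPP/APR.

# Hypothetical illustration: tally SPP/APR improvement activities by category
# to see which category to evaluate first. Activities and tags are made up.
from collections import Counter

improvement_activities = [
    ("Statewide COSF training for providers", "Training and Professional Development"),
    ("Add outcome fields to the data system", "Improve Data Collection"),
    ("Revise eligibility policy guidance", "Clarify/Examine/Develop Policies & Procedures"),
    ("Regional coaching on assessment practices", "Training and Professional Development"),
    ("Quarterly data quality reports to local programs", "Improve Data Collection"),
]

category_counts = Counter(category for _, category in improvement_activities)

# Start the evaluation with the most prevalent category.
for category, count in category_counts.most_common():
    print(f"{count:2d}  {category}")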

13 Developing a Plan for Evaluation
Identify the goal of the evaluation.
Frame the evaluation questions to be answered.
Identify evaluation methods and measurement options.
Identify data sources.
Determine data analysis techniques.
Establish timelines.
There are several steps that need to be completed to develop an evaluation plan for improvement efforts; each step is discussed in more depth on the following slides.

14 What’s Next? Moving beyond data quality
Understanding and manipulating the data you have
Using data to make program improvements

15 Looking at Data
To better understand issues and areas of concern to focus improvement activities...
What does the data analysis say?
What are the 'root causes' of issues or challenges?
Early Childhood Outcomes Center

16 Using data for improvement
Evidence → Inference → Action
Early Childhood Outcomes Center

17 Evidence Evidence refers to the numbers, such as
“89% of families reported ...”
The numbers are not debatable.
Early Childhood Outcomes Center

18 Inference
How do you interpret the numbers?
What can you conclude from the numbers?
Does the evidence mean good news? Bad news? News we can't interpret?
To reach an inference, sometimes we analyze the data in other ways (ask for more evidence).
Early Childhood Outcomes Center

19 Action
Given the inference from the numbers, what should be done?
Recommendations or action steps
Action can be debatable, and often is
Another role for stakeholders
Early Childhood Outcomes Center

20 Program improvement at different levels
At the state level: TA, policy
At the regional or local level: supervision, guidance
At the service/classroom level: implementing high-quality, individualized, family-centered services
There are different program improvement levers at different levels. We are going to focus primarily on state-level use of the information; some state applications translate directly to smaller units. How interventionists or teachers use outcome data for program improvement is a completely different topic; it is very important, but we are not going to cover it here.
Early Childhood Outcomes Center

21 Key points
Evidence refers to the numbers, and the numbers by themselves are meaningless.
Inference is attached by those who read (interpret) the numbers.
You have the opportunity and obligation to attach meaning.
You cannot prevent the misuse of data, but you can set up conditions to make it less likely.
Early Childhood Outcomes Center

22 Continuous Program Improvement
A continuous cycle: Plan (the vision: program characteristics, child and family outcomes), Implement, Check (collect and analyze data), and Reflect (Are we where we want to be?).
Early Childhood Outcomes Center

23 Tweaking the System
The same cycle of Plan (vision), Implement, Check (collect and analyze data), and Reflect, with guiding questions at each step: Is there a problem? Why is it happening? What should be done? Is it being done? Is it working? Are we where we want to be?
Early Childhood Outcomes Center

24 Outcome questions for program improvement, e.g.
Do outcomes vary by:
Region of the state?
Level of functioning at entry?
Services received?
Age at entry to service?
Type of services received?
A sketch of this kind of subgroup comparison follows below.
Early Childhood Outcomes Center
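As a rough illustration of how a state data manager might explore these questions, here is a minimal sketch in Python with pandas. The file name, column names, and rating scale are assumptions made for the example, not a prescribed data layout.

# Minimal sketch: compare an exit rating across subgroups to see where outcomes vary.
import pandas as pd

records = pd.read_csv("child_outcomes.csv")  # hypothetical export: one row per child

# Do outcomes vary by region of the state?
print(records.groupby("region")["outcome1_exit_rating"].agg(["count", "mean"]))

# ...by level of functioning at entry?
print(records.groupby("outcome1_entry_rating")["outcome1_exit_rating"].mean())

# ...by age at entry to service?
age_bands = pd.cut(records["age_at_entry_months"], bins=[0, 12, 24, 36])
print(records.groupby(age_bands)["outcome1_exit_rating"].mean())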

25 Looking at Family Outcomes by Subgroups
System Characteristics
Family Characteristics
Child Characteristics
Service Characteristics
Early Childhood Outcomes Center

26 Are there differences in outcomes across family characteristics?
Race/ethnicity
Family income
Primary language
Family structure
Etc.
Early Childhood Outcomes Center

27 Are there differences in outcomes across child characteristics?
Race/ethnicity
Type of disability
Length of time in services
Etc.
A sketch of this kind of comparison follows below.
Early Childhood Outcomes Center
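The same kind of look applies to the child characteristics above, and to the family characteristics on the previous slide. Here is a minimal sketch, assuming the data system can export a table with a disability-type field and an outcome progress category per child; the file name and column names are hypothetical.

# Minimal sketch: are there differences in outcomes across child characteristics?
import pandas as pd
from scipy.stats import chi2_contingency

records = pd.read_csv("child_outcomes.csv")  # hypothetical export: one row per child

# Share of children in each progress category, by type of disability.
shares = pd.crosstab(records["disability_type"], records["progress_category"],
                     normalize="index")
print(shares.round(2))

# A quick check of whether the differences exceed what chance alone would explain.
counts = pd.crosstab(records["disability_type"], records["progress_category"])
chi2, p_value, dof, expected = chi2_contingency(counts)
print(f"chi-square = {chi2:.1f}, p = {p_value:.3f}")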

28 Examples of process questions
Are ALL services high quality?
Are ALL children and families receiving ALL the services they should in a timely manner?
Are ALL families being supported in being involved in their child's program?
What are the barriers to high quality services?
Early Childhood Outcomes Center

29 Working Assumptions There are some high quality services and programs being provided across the state. There are some families who are not getting the highest quality services. If we can find ways to improve those services/programs, these families will experience better outcomes. Early Childhood Outcomes Center

30 Action Given the inference from the numbers, what should be done?
Develop improvement activities that are:
Targeted, based on data analysis
Based on evidence-based practices
Interconnected, working together to accomplish the desired result
Early Childhood Outcomes Center

31 Small Group Scenarios Early Childhood Outcomes Center

32 Small Group Scenarios Each table is assigned a scenario
Choose a table / scenario.
As a group, walk through the scenario. Discuss and answer questions.
Jot down your ideas and be prepared to share back some highlights.
Early Childhood Outcomes Center

