SSIP Evaluation Workshop 2: Presentation Transcript

1 SSIP Evaluation Workshop 2
SSIP Evaluation Workshop 2.0: Taking the Online Series to the Next Level
Evaluating Infrastructure Breakout
Improving Data, Improving Outcomes Pre-Conference
August 14, 2018

2 State Groupings for Breakout Sessions
Salon F: Practices
GA, MA, LA
CO, UT, AR
CT, PA, ID-B
HI, ID-C
IL, WY

Salon E: Infrastructure
CT, IL, CO
GA, FL

3 Expected Outcomes Participants will increase awareness of:
Existing tools to measure infrastructure outcomes
Considerations for selecting or adapting a tool to measure results of infrastructure improvements
Using multiple methods to evaluate infrastructure outcomes
How one state adjusted their evaluation plan to measure infrastructure improvements, including selecting tools

4 Evaluating Infrastructure Improvements
Evaluate progress: How is implementation going?
Not simply describing the activities that were implemented, but relating them to the initial analysis
Reporting on benchmarks or other indicators of system change
Evaluate outcomes: What changes are we seeing? What is the impact of those changes?
How will the infrastructure support local Early Intervention programs in implementing EBPs?
How will the infrastructure support scaling up and/or sustainability?
Progress toward the SiMR is the ultimate goal, but we all know that is going to take some time. So in addition, you're looking at progress in implementing the SSIP. This helps answer the questions: What is descriptively and operationally different about your system at the end of the SSIP cycle? What will things look like when you've changed or improved your infrastructure? What do you want to know about the change?

5 "To measure an outcome is to measure the end result, not the work involved in getting there".

6 Definitions: Outputs and Outcomes
Outputs: Direct, observable evidence that an activity has been completed as planned.
Outcomes: Statements of the benefit or change you expect as a result of the completed activities. Outcomes can vary along two dimensions:
When you would expect the outcomes to occur, i.e., short-term, intermediate, or long-term (impact); and
The level at which you are defining your outcome, e.g., state level, local/program level, practitioner, child/family.
Focus primarily on the difference between outputs and outcomes. Infrastructure improvement can happen at both the state and local levels. For more information, see key terms and definitions in Evaluating Infrastructure Improvements Session 1 Pre-Work:

7 Example: Finance
Activity: Develop and implement a plan to improve the EI finance system to access additional Medicaid funds.
Output: Finance plan
Outcome: ???? What do you want your system to look like as a result of developing and implementing the finance plan to increase access to additional Medicaid funds?
Performance indicator: ??? How will you know you achieved the outcome?
Describe that a state has an issue with not accessing all available Medicaid funds for EI. As a result, this state is implementing an activity: developing and implementing a plan to improve the EI finance system to access additional Medicaid funds. They have identified an output: a developed finance plan.
Ask the participants: What might be an outcome of this activity? The question the group should consider in coming up with a potential outcome is: What do you want your system to look like as a result of developing and implementing the finance plan to increase access to Medicaid?
Once the group has identified a potential outcome, ask them to identify a potential performance indicator. The question they should consider in developing the performance indicator is: How will you know you achieved the outcome?
If the group falters, here is a possible short-term outcome and set of performance indicators:
Outcome: The number of EI providers that are approved as Medicaid providers increases
Performance indicators:
80% of EI providers will initiate steps to become Medicaid providers by February 2018
70% of all EI providers will be approved as Medicaid providers by December 2018
Here is a possible intermediate outcome and set of performance indicators:
Outcome: The finance system will be enhanced to support and fund the EI system.
90% of the QIs in ECTA System Framework Finance subcomponents 1, 2, and 4 will have more than 50% of elements fully implemented by February 2019.
The total funding available to state EI will increase by $$$$$ by March 2019.
(A small worked sketch of checking indicators like these follows this slide.)
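To make the performance-indicator check concrete, here is a minimal sketch in Python. The two targets come from the slide above; the provider counts are made up for illustration and do not describe any real state:

```python
# Hypothetical check of the finance performance indicators above.
# All provider counts are illustrative, not real state data.

def percent(part: int, whole: int) -> float:
    """Return part as a percentage of whole."""
    return 100.0 * part / whole

total_providers = 140      # all EI providers in the state (made up)
initiated_medicaid = 118   # providers that began Medicaid enrollment (made up)
approved_medicaid = 101    # providers approved as Medicaid providers (made up)

# Indicator name -> (actual percent, target percent from the slide)
indicators = {
    "Initiated steps by Feb 2018": (percent(initiated_medicaid, total_providers), 80.0),
    "Approved by Dec 2018": (percent(approved_medicaid, total_providers), 70.0),
}

for name, (actual, target) in indicators.items():
    status = "met" if actual >= target else "not met"
    print(f"{name}: {actual:.0f}% (target {target:.0f}%) -> {status}")
```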

8 Determining Data Collection Approach
Start by considering existing tools relevant to your infrastructure improvement (e.g., ECTA System Framework, model developer tools, other frameworks).
For the ECTA System Framework: Is there a component that aligns? If so, is there a subcomponent or quality indicator that aligns?
Does the tool measure what you want it to measure? If not, can it be adapted? Will it measure improvements over time?
What data do you already have (e.g., fiscal, personnel, accountability data) that can be used with the tool, or will you need to collect new data?
What additional data could you collect to better understand infrastructure improvement (e.g., qualitative data)?
In determining their data collection approach, states need to take into account their capacity to do this work. Also, a tool might measure what you want to measure but not align with the outcome or performance indicator in your evaluation plan; in that case, you may want to use the tool and revise your outcome and performance indicator so they align with each other.

9 Existing Tools for Evaluating Infrastructure
ECTA System Framework
State or Local Child Outcomes Measurement Framework
Benchmarks of Quality for Home-Visiting Programs
Model developer infrastructure tools
See Evaluating Infrastructure Improvements Session 2 Pre-Work:

10 ECTA System Framework: Quality Indicators/ Elements of Quality
This is an example of one Quality Indicator and several of the Elements of Quality under the Inservice PD and TA subcomponent of the Personnel/Workforce component of the System Framework. This is a screenshot of the Self-Assessment. (Explain that multiple stakeholders, through consensus, establish a rating for each Element of Quality based upon the available evidence for that Element. A rating for the Quality Indicator is automatically calculated from the ratings for the Elements. Note: it is probably not necessary to describe the rating scales, i.e., 1-4 for the Elements and 1-7 for the Quality Indicators, unless there are questions; this is on the next slide.) The Personnel/Workforce component includes other subcomponents: state personnel standards, preservice personnel development, and recruitment and retention.

11 Measuring Improvement: Using Framework Self-Assessment Tools
Measure change over time, from Time 1 to Time 2:
Compare QI ratings, e.g., Time 1 = 3, Time 2 = 5
Compare the percent of elements fully implemented, e.g., Time 1 = 20%, Time 2 = 50%
Compare to a standard, e.g.:
QI rating = 6 (at least 50% of elements are fully implemented and the rest are partially implemented)
At least 50% of the elements are fully implemented
Quality Indicator rating scale: 1 to 7, from no elements to all elements fully implemented
Benchmark: A standard against which a program's results and progress can be compared; a similar measure for a similar group against which progress can be gauged.
A QI rating of 7 would be the highest (all elements fully implemented). Of the two standards above, which sets the higher bar? (A small sketch of these comparisons follows this slide.)
In addition to quantitative descriptors in the SSIP report, highlight examples of accomplishments achieved.
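A minimal Python sketch of the comparisons above. The 1-4 element scale comes from the previous slide; treating a rating of 4 as "fully implemented" and 3 as "partially implemented" is our assumption for illustration, not something the deck states, and the ratings themselves are made up:

```python
# Illustrative change-over-time comparison for one Quality Indicator's elements.
# Assumption (not from the deck): on the 1-4 element scale, 4 = fully
# implemented and 3 = partially implemented.

FULL, PARTIAL = 4, 3

def pct_full(ratings):
    """Percent of elements rated fully implemented."""
    return 100.0 * sum(r == FULL for r in ratings) / len(ratings)

def meets_qi6(ratings):
    """Standard 1: at least 50% fully implemented, the rest at least partial."""
    return pct_full(ratings) >= 50.0 and all(r >= PARTIAL for r in ratings)

def meets_half_full(ratings):
    """Standard 2: at least 50% of elements fully implemented."""
    return pct_full(ratings) >= 50.0

time1 = [2, 3, 2, 1, 3]   # made-up Time 1 ratings: 0% fully implemented
time2 = [4, 4, 3, 3, 4]   # made-up Time 2 ratings: 60% fully implemented

print(f"Fully implemented: Time 1 = {pct_full(time1):.0f}%, Time 2 = {pct_full(time2):.0f}%")
print(f"Time 2 meets the QI-6 standard: {meets_qi6(time2)}")
print(f"Time 2 meets the 50%-fully-implemented standard: {meets_half_full(time2)}")
# Note: the QI-6 standard is the stricter of the two, since it also requires
# every remaining element to be at least partially implemented.
```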

12 Considerations for Tool Selection or Adaptation
Is the tool aligned with the infrastructure improvements you are implementing? If not, could it be adapted?
Is it measuring what you want to measure?
Is it practical to administer?
Number of items
Time required
Can it be implemented consistently across those using the tool?
Clarity of instructions and items
Does the tool allow for enough variation to measure different degrees of progress?
Does the tool provide useful information (e.g., data to determine if modifications to improvement activities are needed)?
Measurement: does the tool measure your infrastructure improvements?
Variation: you need enough variation in the scoring scale (1-4, 1-5, etc.) to be able to measure different degrees of progress.
Useful information for continuous improvement.

13 Decision Points for Adapting Tool
Design of the tool
Phrasing of items: single concept
Phrasing of items: clarity
Selecting the response options
Pilot testing the measure
Method for rating
Recorded sessions (if applicable)
Randomization process (if applicable)
Raters
Training for raters
(Feely et al., 2018)
Design of the tool: format and content.
Phrasing of items, single concept: no double-barreled questions; break them into multiple questions.
Phrasing of items, clarity: avoid jargon; items should be operationalized and clear.
Selecting the response options: yes/no, a scale, or another format.
Pilot testing the measure: will you pilot the measure and revise it to ensure it can be used consistently (reliably)?
Method for rating: live observation, video recording, self-assessment, or another method.
Recorded sessions (if applicable): will any or all events be recorded?
Randomization process (if applicable): will you randomize who is observed or which observations get rated? (Not applicable for infrastructure.)
Raters: who are they, how many are there, and whom do they rate?
Training for raters: what training and practice will be provided for raters on the tool and process? Will they meet interrater reliability before rating? (See the reliability sketch after this slide.)
Icon made by Freepik for flaticon.com
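The workshop does not prescribe a particular reliability statistic; percent agreement and Cohen's kappa (which corrects for chance agreement) are two common choices. A self-contained Python sketch, using made-up practice scores on a 1-4 element scale:

```python
# One way to check interrater reliability before raters score independently:
# percent agreement plus Cohen's kappa. The ratings below are made-up
# practice scores, not data from any state.
from collections import Counter

def cohens_kappa(a, b):
    """Cohen's kappa for two raters' categorical ratings of the same items."""
    n = len(a)
    p_o = sum(x == y for x, y in zip(a, b)) / n               # observed agreement
    ca, cb = Counter(a), Counter(b)
    p_e = sum(ca[k] / n * cb[k] / n for k in set(a) | set(b))  # chance agreement
    return (p_o - p_e) / (1 - p_e)

rater1 = [4, 3, 3, 2, 4, 1, 3, 4, 2, 3]
rater2 = [4, 3, 2, 2, 4, 1, 3, 3, 2, 3]

agreement = sum(x == y for x, y in zip(rater1, rater2)) / len(rater1)
print(f"Percent agreement: {agreement:.0%}")        # 80%
print(f"Cohen's kappa: {cohens_kappa(rater1, rater2):.2f}")  # about 0.72
```

A common rule of thumb is to require a kappa threshold (e.g., 0.7 or higher) before allowing raters to score independently; whatever threshold is used should be set before training begins.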

14 Considerations for Using the Tool
Who participates (e.g., stakeholder groups, local programs, state staff)?
How will information be collected (e.g., data system, checklist, self-rating scale, behavioral observation, interviews)? Online or hard copy?
Will data need to be collected from comparison groups? If so, will it be through pre- and post-collections?
When will data collection happen?
Is it easy to administer? Is training needed?

15 State X Example: Infrastructure Evaluation Challenges
Implementing a variety of improvement activities related to:
In-service PD system
Local program infrastructure to support implementation of EBPs
Child outcome measurement system
Only measuring progress of infrastructure improvement through outputs (i.e., not measuring infrastructure improvement outcomes)
Uncertain about available tools to measure infrastructure improvements and how to select or adapt them
Limited state and local program staff time to adapt/develop tools and collect data

16 State X: In-service PD Improvement Activities
Enhancing their in-service PD system by developing:
Provider competencies
Training materials
Procedures to sustain coaching with new providers
State X had identified issues with their in-service professional development system and as a result implemented improvement activities such as developing provider competencies, developing training materials, and developing procedures to sustain coaching of new providers. Since State X had not developed outcomes for measuring the results of these infrastructure improvements in their evaluation plan, their next step was to develop the outcome, evaluation questions, and performance indicators, and to determine what tool they would use to measure improvements.

17 State X Outcome Evaluation of In-service PD
Outcome Type: State System-Level: Intermediate
Outcome: A sustainable statewide system is in place to support high-quality personnel development and technical assistance
Evaluation Question(s):
a. Has the statewide system for in-service personnel development and technical assistance improved (incremental progress)?
b. Does the state have a quality system for in-service personnel development and technical assistance?
How will we know (Performance Indicator):
a. Quality Indicator PN7 in the in-service personnel development subcomponent will have a QI rating of 5 in 2018
b. Quality Indicator PN7 for the in-service personnel development subcomponent will have a QI rating of 6 or 7 in 2019
Measurement/Data Collection Method: System Framework Self-Assessment on in-service personnel development and technical assistance (Personnel/Workforce, subcomponent 4, PN7)
Timeline/Measurement Intervals: a. 3/18; b. post measure 3/19
Analysis Description: a. Compare the automatically calculated QI self-assessment score for PN7 to a rating of 5 in 3/18; b. compare the automatically calculated QI self-assessment score for PN7 to a rating of 6 or 7 in 3/19
With TA support, State X developed an outcome for their in-service PD system, along with several evaluation questions and performance indicators. After exploring what existing tools were available, what data they had, and their capacity, State X determined they would use one Quality Indicator (PN7) and its Elements of Quality from the Personnel/Workforce component of the System Framework to measure the results of their in-service PD improvement efforts. The state initially thought they would need to conduct focus groups to collect outcome data, but they acknowledged time limitations on the part of state staff as well as other key stakeholders for collecting this kind of data. The state realized they had available evidence (i.e., existing data) to support their work on the Elements of Quality for PN7 and knew they had the capacity to complete this portion of the Framework Self-Assessment. Note that they are measuring the results of their in-service PD improvement activities against one standard in 2018 and a higher standard in 2019 (i.e., PN7 will have a rating of 5 in 2018 and a rating of 6 or 7 in 2019).
Ask: If the state wanted to go deeper into what was and was not working with their in-service PD system, what methods could the state use to supplement the existing data from the Framework Self-Assessment? Answer: One possibility would be conducting focus groups.

18 State X: Local Infrastructure Improvement
Improvement Activity: Supporting demonstration sites in establishing the necessary personnel infrastructure to implement Coaching in Natural Learning Environments EBPs (Shelden and Rush)
Outcome: EI demonstration sites will have the team structure necessary to implement the EBP (Coaching in Natural Learning Environments)
Tool: Checklist for Implementing a Primary Coach Approach to Teaming (Shelden & Rush)
State X was also implementing activities to support each local demonstration site in ensuring it had the necessary personnel infrastructure to implement EBPs related to Coaching in Natural Learning Environments (Shelden and Rush). State X wanted to measure the results of these infrastructure improvements at the local level. They considered what they wanted their local demonstration sites to look like as a result of implementing these improvement activities, and established the outcome: EI demonstration sites will have the team structure necessary to implement the EBP (Coaching in Natural Learning Environments).
State X explored what existing tools were available, especially the infrastructure tools developed by Rush and Shelden, and discussed whether the tools would measure their outcome. They discussed the possibility of adapting an existing tool or developing a new one. Regardless of whether they used an existing tool, an adapted tool, or a newly developed tool, they realized they would have to collect new data from the local implementation sites, since they did not already have this data available. As a result, they looked at the ease of use of the existing tools. State X decided to use the Checklist for Implementing a Primary Coach Approach to Teaming developed by Rush and Shelden to measure local personnel infrastructure improvements.

19 State X: Improving Child Outcome System
Improvement Activities: Improving the child outcome measurement system (e.g., developing new COS resources to support consistent COS ratings, developing family materials on the COS process, developing processes for EI programs' ongoing use of COS data, revising COS training materials)
Outcome: The state has an improved system for child outcome measurement
Tool: State Child Outcomes Measurement System Framework Self-Assessment [Data Collection, Analysis, and Using Data]
Another infrastructure area: State X was devoting significant time to improving their child outcomes measurement system. They were developing new COS resources to support consistent COS ratings, developing family materials on the COS process, establishing processes that support EI programs' ongoing use of COS data for program improvement, and revising COS training materials. These activities crossed multiple infrastructure components, including Data System and Personnel/Workforce. Similar to their previous process, the state discussed what they wanted their system to look like when these improvement activities were implemented. They initially discussed having multiple outcomes but considered their limited capacity to measure multiple outcomes. As a result, they landed on one outcome, especially after exploring which existing tools would measure what they hoped to achieve with these infrastructure improvements. They had previously completed portions of the State Child Outcomes Measurement System Framework prior to implementation of these improvement activities, so they already had Time 1 data available. State X determined they had the capacity to complete sections of the State Child Outcomes Measurement System Framework annually to measure ongoing progress with their COS infrastructure.

20 Questions

21 State Work Time

22 How we will Work Together
Today is a conversation
Ask questions
Tell us what you want to work on
Tell us how we can support you going forward
We asked you about your priorities for today. These priorities may have changed. We hope that you will be able to engage in conversations with other states.

23 Optional Worksheets for State Work Time
Evaluation Plan Worksheet Selecting an Infrastructure Tool Worksheet Decision Points for Adapting a Tool Worksheet

24 Key Resources
Definitions: Evaluating Infrastructure Improvements Session 1 Pre-Work:
Tools for evaluating infrastructure improvements: Evaluating Infrastructure Improvements Session 2 Pre-Work:
Questions to refine evaluation, including data collection: Refining Your Evaluation: Data Pathway – From Source to Use:

25 Contact Information
Christina Kasprzak, ECTA
Ardith Ferguson, NCSI
Sherry Franklin, ECTA

