Part B: Evaluating Implementation Progress and Impact of Professional Development Efforts on Evidence-Based Practices | NCSI Cross-State Learning Collaboratives


1 Part B: Evaluating Implementation Progress and Impact of Professional Development Efforts on Evidence-Based Practices | NCSI Cross-State Learning Collaboratives. Part B Meeting, November 29 and December 1, 2016, Dallas, Texas

2 Session Overview: Aligning your evaluation plan, theory of action, and logic model; evaluating implementation of professional development (PD) activities; evaluating the impact of PD activities; and providing an opportunity to hear from one state about its efforts to evaluate PD to support evidence-based practices (EBPs) and to engage in a discussion about the successes and challenges of your own work.

3 How Does Phase III Build Upon Previous Components of the State Systemic Improvement Plan (SSIP)?
Development of the SSIP: In Phase I, you created a theory of action. In Phase II, you developed an evaluation plan that defined activities, outputs, and intended short-term and long-term outcomes. In Phase III, you are implementing activities and strategies from your SSIP, measuring implementation progress, and assessing impact toward outcomes. "You can't know where you're going unless you know where you've been." --English proverb

4 Let’s Revisit Your Theory of Action

5 Theory of Action Alignment to a Logic Model
Connect the "If-Then" statements from the theory of action to the implementation and outcomes components of the logic model. Adapted from University of Wisconsin-Extension, 2010.

6 Evaluating Professional Development
PD to support EBP implementation → increased practitioner EBP knowledge and skill → improved practitioner EBP implementation → improved student outcomes. At each stage, the questions are similar: Did you do it? Is it achieving results? You can evaluate at any of these stages, starting with the earlier ones. PD cannot improve student outcomes without changes in adult behavior.

7 Implementation Outcomes Evaluation Questions
Implementation: How do you know implementation is going as planned? Is your PD being implemented as planned? Are changes needed? Outcomes: How do you know the implementation is impacting results? Is your PD impacting State-identified Measurable Results (SiMRs)? Evaluation questions can be process- or outcome-oriented. Process/implementation evaluation determines whether program activities have been implemented as intended. Outcome/effectiveness evaluation measures program effects in the target population (in our case, PD) by assessing progress on the outcomes or SiMR achievement. The theory of action can be expanded into a logic model, which takes the "If-Then" statements and further details the evaluation plan by developing implementation and outcome questions (evaluation questions).

8 Questions to Consider When Evaluating PD activities
How do you know your PD is being implemented as planned? Consider audience, training delivery, and content. How do you know if the PD is achieving intended outcomes? Consider knowledge/skills, behavior, implementation of learned practices, and fidelity. Implementation questions: Did you reach the intended audience? Did you deliver the training events as planned? Was the planned content delivered? Was the content delivered as planned? Outcome questions, from the perspective of trained staff: Did the participants believe they acquired knowledge or skills? Did the participants' behavior change ("I will implement…")? Did the participants implement learned practices? Did the participants implement practices with fidelity?

9 Questions to Consider in Evaluating Implementation of PD Activities
What PD activities will you implement this year? What data will you collect to assess whether the PD was delivered as intended (for example, the audience of the training events: who attended, how many, and so on)? Is the PD consistent across all sites? When will data be collected and analyzed? How will you determine that the data are valid and reliable? What is your plan for using the data? How have you engaged stakeholders, and how did you involve them in these decisions? Focus data collection and analysis efforts on what you'll use! Are we on track? If so, how do we know? If not, what should we change? Overarching questions: What data will be collected to measure implementation progress? How will the data be analyzed to measure implementation progress? How will you determine that the data are valid and reliable?
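To make the data-collection questions concrete, here is a minimal sketch in Python of how a team might summarize training logs to check whether PD was delivered as planned and consistently across sites. The file name, column names, and thresholds are illustrative assumptions, not an NCSI-provided format.

```python
# Minimal sketch: summarizing hypothetical PD training logs to check
# delivery against plan across sites. All names below are assumptions.
import pandas as pd

# One row per delivered PD session: site, invited and actual attendees,
# and whether the full planned content was covered (0/1).
logs = pd.read_csv("pd_training_logs.csv")
logs["attendance_rate"] = logs["attendees"] / logs["invited"]

summary = logs.groupby("site").agg(
    sessions_delivered=("session_id", "count"),
    avg_attendance_rate=("attendance_rate", "mean"),
    pct_content_covered=("content_covered", "mean"),
)

# Flag sites whose delivery diverges from plan. Thresholds here are
# illustrative; a real plan would set them with stakeholders.
summary["needs_follow_up"] = (
    (summary["sessions_delivered"] < 4)
    | (summary["pct_content_covered"] < 0.8)
)
print(summary)
```

A summary like this answers the process questions (Did you do it? Was it consistent across sites?) but says nothing yet about outcomes; those require the knowledge, behavior, and fidelity measures discussed on the next slides.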

10 Questions to Consider in Evaluating Progress Toward Outcomes
How will you measure the impact of PD on practitioner knowledge of the EBP? How will you know if practitioner behavior changed as a result of the PD? What strategies will you use to measure fidelity of implementation of the EBP? Who will be responsible for analyzing the data? What is your plan for using the data? How have you engaged stakeholders? How did you involve stakeholders in these decisions?

11 Questions to Consider in Evaluating Progress Toward Outcomes (continued)
Participant Reaction: Are the participants satisfied with the PD experience? (Data sources: survey, PD exit evaluation)
Participant Learning: Did the participants acquire the intended knowledge or skills? (Data sources: survey, PD exit evaluation, pretest/posttest)
Organization Support: Were resources made available to support participant knowledge development? (Data sources: course materials)
Participant Use of Knowledge: Did the participants (effectively) apply the new knowledge? (Data sources: intervention fidelity rubrics)
Student Learning Outcomes: What was the impact on students? (Data sources: formative assessments, statewide assessments, screening tools)
How do we use data at each level to inform progress? Adapted from Guskey, T. R. (2000). Evaluating Professional Development. Thousand Oaks, CA: Corwin.
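As one illustration of the Participant Learning level, the sketch below pairs each participant's pretest and posttest scores and runs a paired t-test. The file name, column names, and data are hypothetical; pandas and scipy are assumed to be available.

```python
# Minimal sketch: measuring participant learning with hypothetical
# pretest/posttest scores. File and column names are assumptions.
import pandas as pd
from scipy import stats

# One row per participant, with that person's pretest and posttest scores.
scores = pd.read_csv("pd_knowledge_assessment.csv")

gain = scores["posttest"] - scores["pretest"]
t_stat, p_value = stats.ttest_rel(scores["posttest"], scores["pretest"])

print(f"Mean gain: {gain.mean():.2f} points (n={len(scores)})")
print(f"Paired t-test: t={t_stat:.2f}, p={p_value:.4f}")
# A positive mean gain with a small p-value is evidence that
# participants acquired the intended knowledge, but fidelity and
# student-outcome data are still needed before claiming impact.
```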

12 One State's Journey: Ann Alexander, Will Jensen, and Julie Bowers. Note to facilitator: please introduce the members of the team who will be talking.

13 Discussion—Guiding Question 1
What PD strategies/activities will you evaluate this year? Why did you choose these activities? Did you make changes or adjustments from your original plan? Why are the activities critical for making progress toward the SiMR? How did you engage stakeholders in the process?

14 Discussion—Guiding Question 2
What are your short-term outcomes for PD activities? What are your long-term outcomes for PD activities? What are the critical evaluation questions to measure these outcomes?

15 Discussion—Guiding Question 3
What data will be collected to measure implementation progress? How will the data be analyzed to measure implementation progress? How will you determine that the data are valid and reliable?

16 Discussion—Guiding Question 4
How will you measure the impact of your PD efforts? How will you use the data to drive decisions about future PD efforts?

17 Discussion and Questions
Note: The previous four slides provide guiding questions to orient and facilitate the discussion. This final slide is intentionally left open-ended (i.e., no prompt) to give participants an opportunity to offer any final thoughts, ask any remaining questions, and so on.

18 Resources and Tools: Phase III Tools
Implementation Evaluation Matrix: Designed by the National Center for Systemic Improvement (NCSI) to provide states with a sample approach and tool to plan and track measures of State Systemic Improvement Plan (SSIP) implementation. This resource will assist states in addressing the SSIP requirements laid out in the State Performance Plan/Annual Performance Report (SPP/APR) Part B and Part C Indicator Measurement Tables, and the SSIP Phase II OSEP Guidance and Review Tool, which call for the evaluation of implementation as well as outcomes. Phase III Guidance Tool: Developed by the Office of Special Education Programs (OSEP) Performance Accountability Implementation Team to assist states in developing, implementing, and evaluating their SSIPs. Phase III Organizational Report Outline: An optional report outline developed by the OSEP Performance Accountability Implementation Team, in response to state requests for a Phase III writing guide or template, to assist states in developing their SSIP Phase III submission due to OSEP on April 3, 2017.

19 NCSI Presenters and Contributors: Candice Bocala, Cesar D'Agord, Pakethia Harris, and Kristin Ruedel.
For additional information, support, and technical assistance: contact your NCSI TA facilitator; submit your question on Ask the NCSI; or contact Kristin Ruedel, NCSI Data Use Service Area Lead.

20 THANK YOU! http://ncsi.wested.org | @TheNCSI

