Part B: Evaluating Implementation Progress and Impact of Professional Development Efforts on Evidence-Based Practices. NCSI Cross-State Learning Collaboratives.

Presentation transcript:

Part B: Evaluating Implementation Progress and Impact of Professional Development Efforts on Evidence-Based Practices
NCSI Cross-State Learning Collaboratives Part B Meeting, November 29 and December 1, 2016, Dallas, Texas

Session Overview
• Aligning your evaluation plan, theory of action, and logic model
• Evaluating implementation of professional development (PD) activities
• Evaluating impact of PD activities
• Providing an opportunity to hear from one state about its efforts to evaluate PD to support evidence-based practices (EBPs), and to engage in a discussion about the successes and challenges of your own work

How Does Phase III Build Upon Previous Components of the State Systemic Improvement Plan (SSIP)?
Development of the SSIP:
• In Phase I, you created a theory of action.
• In Phase II, you developed an evaluation plan, defining activities, outputs, and intended short-term and long-term outcomes.
• In Phase III, you are implementing activities and strategies from your SSIP, measuring implementation progress, and assessing impact toward outcomes.
"You can't know where you're going unless you know where you've been." --English proverb

Let’s Revisit Your Theory of Action

Theory of Action Alignment to a Logic Model
(Diagram: theory of action mapped to logic model Implementation and Outcomes components.) Connect the "If-Then" statements from the theory of action to the logic model's implementation and outcomes. Adapted from University of Wisconsin-Extension, 2010.

Evaluating Professional Development
PD to support EBP implementation → increased practitioner EBP knowledge and skill → improved practitioner EBP implementation → improved student outcomes.
At each stage the questions are similar: Did you do it? Is it achieving results? You can evaluate at any of these stages, starting with the earliest. PD cannot improve student outcomes without changes in adult behavior.

Evaluation Questions
Implementation: How do you know implementation is going as planned? Is your PD being implemented as planned? Are changes needed?
Outcomes: How do you know the implementation is impacting results? Is your PD impacting State-identified Measurable Results (SiMRs)?
Evaluation questions can be process- or outcome-oriented. Process/implementation evaluation determines whether program activities have been implemented as intended. Outcome/effectiveness evaluation measures program effects in the target population (in our case, PD) by assessing progress on outcomes or SiMR achievement. The theory of action can be expanded into a logic model, which takes the "If-Then" statements and further details the evaluation plan by developing implementation and outcome (evaluation) questions.

Questions to Consider When Evaluating PD Activities
How do you know your PD is being implemented as planned? Audience, training, delivery, content.
How do you know if the PD is achieving intended outcomes? Knowledge/skills, behavior, implementation of learned practices, fidelity.
Implementation questions: Did you reach the intended audience? Did you deliver the training events as planned? Was the planned content delivered? Was the content delivered as planned?
Outcomes from the perspective of trained staff: Did the participants believe they acquired knowledge or skills? Did the participants' behavior change ("I will implement…")? Did the participants implement learned practices? Did they implement the practices with fidelity?

Questions to Consider in Evaluating Implementation of PD Activities
• What PD activities will you implement this year?
• What data will you collect to assess whether the PD was delivered as intended (for example, the audience of the training events: who, how many, and so on)?
• Is the PD consistent across all sites?
• When will data be collected and analyzed?
• How will you determine that the data are valid and reliable?
• What is your plan for using the data?
• How have you engaged stakeholders? How did you involve them in these decisions?
Focus data collection and analysis efforts on what you'll use: Are we on track? If so, how do we know? If not, what should we change?
Overarching questions: What data will be collected to measure implementation progress? How will the data be analyzed to measure implementation progress? How will you determine that the data are valid and reliable?
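To make these overarching questions concrete, the following minimal sketch (not part of the original presentation) shows one way a state team might summarize implementation-progress data. The record structure, field names, and the 80 percent benchmark are hypothetical assumptions for illustration, not an NCSI or OSEP requirement.

```python
# Illustrative sketch only: summarizing PD implementation-progress data.
# The session records, field names, and the 80% benchmark are hypothetical
# assumptions for demonstration, not a prescribed format.

planned_sessions = [
    {"site": "District A", "delivered": True,  "content_as_planned": True,  "intended": 30, "attended": 27},
    {"site": "District B", "delivered": True,  "content_as_planned": False, "intended": 25, "attended": 18},
    {"site": "District C", "delivered": False, "content_as_planned": False, "intended": 20, "attended": 0},
]

total = len(planned_sessions)
delivered = sum(s["delivered"] for s in planned_sessions)
as_planned = sum(s["content_as_planned"] for s in planned_sessions)
reach = sum(s["attended"] for s in planned_sessions) / sum(s["intended"] for s in planned_sessions)

print(f"Sessions delivered: {delivered}/{total} ({delivered / total:.0%})")
print(f"Delivered with planned content: {as_planned}/{total} ({as_planned / total:.0%})")
print(f"Intended audience reached: {reach:.0%}")

# Simple "are we on track?" check against the hypothetical 80% benchmark.
on_track = (delivered / total >= 0.8) and (reach >= 0.8)
print("On track:", on_track)
```

In practice, a team would replace the inline records with exports from its PD tracking or data management system and agree with stakeholders in advance on what benchmark counts as "on track."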

Questions to Consider in Evaluating Progress Toward Outcomes
• How will you measure the impact of PD on practitioner knowledge of the EBP?
• How will you know if practitioner behavior changed as a result of the PD?
• What strategies will you use to measure fidelity of implementation of the EBP?
• Who will be responsible for analyzing the data?
• What is your plan for using the data?
• How have you engaged stakeholders? How did you involve stakeholders in these decisions?

Questions to Consider in Evaluating Progress Toward Outcomes
• Participant reaction: Are the participants satisfied with the PD experience? (Survey, PD exit evaluation)
• Participant learning: Did the participants acquire the intended knowledge or skills? (Survey, PD exit evaluation, pretest/posttest)
• Organization support: Were resources made available to support participant knowledge development? (Course materials)
• Participant use of knowledge: Did the participants (effectively) apply the new knowledge? (Intervention fidelity rubrics)
• Student learning outcomes: What was the impact on students? (Formative assessments, statewide assessments, screening tools)
How do we use data at each level to inform progress?
Adapted from Guskey, T. R. (2000). Evaluating professional development. Thousand Oaks, CA: Corwin.
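As one illustration of the participant-learning level above, here is a minimal sketch (not from the presentation) of summarizing pretest/posttest scores from a PD exit evaluation. The 0-100 score scale and the 10-point gain threshold are assumptions chosen only for demonstration.

```python
# Illustrative sketch: summarizing pretest/posttest knowledge scores.
# Scores on a 0-100 scale and the 10-point "meaningful gain" threshold
# are hypothetical assumptions, not a required metric.

from statistics import mean

participants = {
    "P01": {"pre": 55, "post": 78},
    "P02": {"pre": 62, "post": 70},
    "P03": {"pre": 48, "post": 81},
    "P04": {"pre": 70, "post": 72},
}

gains = {pid: s["post"] - s["pre"] for pid, s in participants.items()}

print(f"Average pre-to-post gain: {mean(gains.values()):.1f} points")
print(f"Participants gaining 10+ points: {sum(g >= 10 for g in gains.values())}/{len(gains)}")
```

Results like these speak only to the participant-learning level; the organization-support, fidelity, and student-outcome levels require their own data sources, as the slide notes.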

One State's Journey: Ann Alexander, Will Jensen, and Julie Bowers. (Note to facilitator: please introduce the members of the team who will be talking.)

Discussion—Guiding Question 1
What PD strategies/activities will you evaluate this year, and why are these activities critical for making progress toward the SiMR?
• Why did you choose the activities? Did you make changes or adjustments from your original plan?
• How did you engage stakeholders in the process?

Discussion—Guiding Question 2
• What are your short-term outcomes for PD activities? What are your long-term outcomes for PD activities?
• What are the critical evaluation questions to measure these outcomes?

Discussion—Guiding Question 3
• What data will be collected to measure implementation progress? How will the data be analyzed to measure implementation progress?
• How will you determine that the data are valid and reliable?

Discussion—Guiding Question 4
• How will you measure the impact of your PD efforts?
• How will you use the data to drive decisions about future PD efforts?

Discussion and Questions
Note: On the previous four slides, we provided guiding questions to orient and facilitate the discussion. This final slide is left open-ended (i.e., no prompt) to give participants an opportunity to offer any final thoughts, ask remaining questions, and so on.

Resources and Tools: Phase III Tools
• Implementation Evaluation Matrix: Designed by the National Center for Systemic Improvement (NCSI) to provide states with a sample approach and tool to plan and track measures of SSIP implementation. It will assist states in addressing the SSIP requirements laid out in the State Performance Plan/Annual Performance Report (SPP/APR) Part B and Part C Indicator Measurement Tables and the SSIP Phase II OSEP Guidance and Review Tool, which call for the evaluation of implementation as well as outcomes.
• Phase III Guidance Tool: Developed by the Office of Special Education Programs (OSEP) Performance Accountability Implementation Team to assist states in developing, implementing, and evaluating their SSIPs.
• Phase III Organizational Report Outline: An optional report outline developed by the OSEP Performance Accountability Implementation Team, in response to state requests for a Phase III writing guide or template, to assist states in developing their SSIP Phase III submission due to OSEP on April 3, 2017.

For additional information, support, and technical assistance:
• Contact your NCSI TA facilitator
• Submit your question on Ask the NCSI
• Contact Kristin Ruedel, NCSI Data Use Service Area Lead, at kruedel@air.org
NCSI Presenters and Contributors: Candice Bocala (cbocala@wested.org), Cesar D'Agord (cdagord@wested.org), Pakethia Harris (pharris@air.org), and Kristin Ruedel (kruedel@air.org)

THANK YOU! http://ncsi.wested.org | @TheNCSI