Region 1 PTAC Regional Conference


Region 1 PTAC Regional Conference Feeling Adrift in the Results Driven Accountability Seas? Strategies for Connecting with and Supporting Your State’s SSIP October 27, 2016

Session Objectives
Build knowledge and understanding about RDA and Phases I, II, and III of the SSIP
Explore state-specific SSIP plans and improvement strategies
Discuss with colleagues connections between the work of your center and your state's SSIP efforts

Results Driven Accountability (RDA)
The primary focus of Federal and State monitoring activities shall be on:
1. Improving educational results and functional outcomes for all children with disabilities
2. Ensuring that States meet the program requirements, with emphasis on those most related to improving results
20 U.S.C. 1416(a)(2); Sec. 616(a)

Results Driven Accountability (RDA)
OSEP's Goal: Balance the approach! Prioritize improved outcomes for infants, toddlers, children, and youth with disabilities and their families. (OSEP, 10-5-12)
RDA balances the focus on improved educational results and functional outcomes for students with disabilities (SWD), on the one hand, with consideration of compliance as it relates to those results and outcomes, on the other. The SPP/APR is a critical component of RDA.

RDA Core Principles
Partnership with stakeholders
Transparent and understandable
Drives improved outcomes
Protection of individual rights
Differentiated incentives
Resources matched to impact
Responsive to ultimate consumers

Why RDA? Why Now?
For over 30 years, there has been a strong focus on regulatory compliance with the IDEA and Federal regulations for early intervention and special education, at every level: OSEP, States, and Districts/Programs. As a result, compliance has improved!

Why RDA? Why Now? Despite this focus on compliance, we are still not seeing the outcomes for children with disabilities that we hope for and expect.

Elements of RDA
Differentiated monitoring and support (https://osep.grads360.org/#communities/pdc/documents/10319)
Results-focused determinations
State Systemic Improvement Plan (SSIP)
Suspended on-site compliance monitoring
Continued state monitoring
APR reviews
Fiscal monitoring of requirements
Continued technical assistance (TA)

State Systemic Improvement Plan The State Systemic Improvement Plan (SSIP) is a comprehensive, multi-year plan that focuses on improving results for children with disabilities. The SSIP is reported in the state’s SPP/APR (Indicator 11 for Part C and Indicator 17 for Part B).

SSIP Activities by Phase
The SSIP is developed in two phases and then implemented and evaluated in a third phase. These phases cover the new APR reporting period of 2015-2020. Stakeholder engagement spans all three phases.
Phase I (Analysis) — Year 1, FFY 2013, delivered by February 2015:
Data analysis; identification of the State-Identified Measurable Result (SIMR); description of infrastructure to support improvement and build capacity; theory of action.
Phase II (Plan) — Year 2, FFY 2014, delivered by February 2016:
Multi-year plan addressing infrastructure development; support for EIS programs/LEAs in implementing evidence-based practices; evaluation plan.
Phase III (Implementation and Evaluation) — Years 3-6, FFY 2015-18, delivered February 2017 through February 2020:
Reporting on progress, including results of the ongoing evaluation, extent of progress, and revisions to the SPP.

SSIP Phase I
Data Analysis
Infrastructure Analysis
State-Identified Measurable Result (SIMR)
Coherent Improvement Strategies
Theory of Action
**Stakeholder Engagement**

SSIP Phase II
Infrastructure Development
Implementation of Evidence-Based Practices (EBPs)
Evaluation
**Stakeholder Engagement**

What do states’ SSIPs look like?

Part C and B Phase I and Phase II SSIP Analyses
Analyses were conducted by the following TA centers:
NCSI
IDC
NTACT
ECTA
DaSy

State-Identified Measurable Result (SIMR) A statement of the result(s) the State intends to achieve through the implementation of the SSIP. The State-identified result(s) must be clearly based on the Data and State Infrastructure Analyses and must be a child-level outcome in contrast to a process outcome.

States were required to analyze key data (SPP/APR, 618, and other data), including:
Review of disaggregated data
Identification of data quality issues
Identification of how data quality issues will be addressed
Identification of compliance issues that are barriers

Phase I: Part C Data Sources Used Across Programs/Agencies
All states used child and/or family outcomes data as the primary data source in their analyses for selecting a SIMR. The 56 reporting states also accessed data sources beyond child and family outcomes data, including sources outside the Part C program/agency: almost all states (98%) analyzed SPP/APR data, while more than three quarters of the states (77%) accessed 618 data.

Phase I: SIMR Selected by Part B States

Phase I: SIMR Selected by Part C States, FFY 2013 (2013-2014)
[Map showing which child and family outcomes each state selected; legend below.]
Child Outcomes: C3A, social relationships; C3B, knowledge and skills; C3C, meeting own needs; some states selected both C3A and C3B, and others all three child outcomes.
Family Outcomes: C4A, know their rights; C4B, communicate children's needs; C4C, help their children develop and learn.
Other: NY selected all three family outcomes plus other content; MP selected domains from an assessment tool.
Seven states and territories are measuring clustered SIMRs: LA, ME, MN, NM, NY, OR, PA. Note: The Northern Mariana Islands are using a measurement that is not an indicator on the APR.

Phase II: Revisions to Part B SIMRs Thirteen states out of 60 (22%) indicated making revisions to the SIMRs reported in Phase I. Some examples of the rationales that states provided for adjusting their SIMRs included: better alignment of the SIMR with the Elementary and Secondary Education Act (ESEA) Waiver/Every Student Succeeds Act (ESSA); changes in measurement of student achievement from Annual Measurable Objective (AMO) to the English Language Arts (ELA) state test; recommendations from OSEP and stakeholders to align with sites participating in Part C SSIP; combining the State Personnel Development Grant (SPDG) and SSIP; and changes in assessment procedures.

Phase II: Revisions to Part C SIMRs Seven of the 56 Part C states (13%) revised their SIMR baselines based on the following rationales: Two states revised their baselines because of improved data quality. One state revised because it is using a new measurement tool. One state revised because it changed the SIMR from a measure of the total population at exit to a measure of a subgroup of the population. One state revised to better align the baseline with its initiation of implementation. Two states did not provide rationales for changing their baselines.
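The revision counts above imply the cited percentages. As a quick sanity check, here is a minimal Python sketch reproducing them; the `revision_rate` helper is purely illustrative, not part of any SSIP tooling:

```python
import math

def revision_rate(revised, total):
    # Percent of states revising their SIMR, rounded half-up.
    # (Python's built-in round() uses banker's rounding, which would
    # turn the Part C figure of 12.5 into 12 rather than the cited 13.)
    return math.floor(100 * revised / total + 0.5)

part_b = revision_rate(13, 60)  # Part B: 13 of 60 states -> 22
part_c = revision_rate(7, 56)   # Part C: 7 of 56 states -> 13
print(part_b, part_c)
```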

Coherent Improvement Strategies and Selection of Evidence-Based Practices States must describe how they selected specific improvement strategies; why those strategies are sound, logical, and aligned; and how they will lead to improvement in the SIMR. States must also describe how implementation of the improvement strategies will address identified root causes for low performance.

Phase II: Part B States’ Selection of EBPs

Phase II: Part B States’ Approaches to Implementing EBPs States reported a variety of methods for determining which EBPs are to be implemented by LEAs. Eighteen states (30%) indicated selecting one or more EBPs that LEAs or school sites are required to implement, and 15 states (25%) identified several EBPs from which LEAs/school sites may choose to implement as part of the SSIP. In fewer instances, states indicated that LEAs/schools could choose EBPs based upon state-specified criteria (7 states, 12%) or could select EBPs without any state-specified criteria (4 states, 8%).

Phase II: Part B States’ Selected EBPs for Reading and Math SIMRs

Phase II: Part C States’ Selected EBPs

Phase II: Part C States’ Implementation of EBPs

Stakeholder Engagement

Phase I: Part B Stakeholder Engagement All states indicated external stakeholders were engaged in at least some aspect of the development of Phase I. External stakeholders represented over 20 different roles, titles, and organizations. Across all SSIPs, parents, Parent Training and Information Centers, parent advocacy organizations, and LEA representatives (e.g., administrators, educators, and related services personnel) were the most frequently identified participants and organizations. Some states included other stakeholders, such as students with disabilities, representatives of law centers, teacher unions, and civic organizations.

Phase I: Part B Stakeholder Engagement
Most Frequent Participants: Parents; Parent Training and Information Centers; parent advocacy organizations; LEA representatives (e.g., administrators, educators, and related services personnel)
Unique Participants: Students with disabilities; law centers; teacher unions; civic organizations

Phase I: Part B Stakeholder Engagement

Phase I: Part C Stakeholder Engagement

Phase I: Part C Stakeholder Engagement

Phase II: Part B Stakeholder Involvement in Implementation of EBPs States identified many offices within their SEAs, other state agencies, and other stakeholders who will be involved in scaling up and sustaining the EBPs once they have been implemented. SEA Part B staff (46 states, 77%), general education staff within the SEA (39 states, 65%), and LEA administrators (30 states, 50%) were noted most often as those who will be involved. Higher education personnel were identified in 21 states (35%). One state reported that teacher unions and the Parent/Teacher Association (PTA) would be involved. At least two states indicated that SPDG staff/coaches would also be a part of the process.

Phase II: Part C Stakeholder Involvement in Implementation of EBPs Figure 21 depicts the types of stakeholders that states reported would be involved in EBP implementation. At least 70% of the 56 states reported that the following stakeholders would be involved in the implementation of EBPs: EIS providers (73%), local program administrators (71%), family representatives (70%), and staff representing other state agencies (70%). Smaller but still substantial percentages of states reported that the following types of stakeholders would be involved: state Part C TA personnel (64%); higher education (61%); staff representing other programs within the lead agency (61%); ICC members, not specified by their role (57%); and representatives from the EC community (55%). Also mentioned in state reports were consultants/contractors (38%) and state legislators (5%).

Stakeholder Engagement in Phase III (OSEP Requirements)
States are required to:
Communicate SSIP revisions and rationale to stakeholders
Disseminate and solicit information from stakeholders
Address any concerns raised by stakeholders
Involve stakeholders in the implementation and evaluation of SSIP activities

Discussion: Pair Share

Contact NCSI: Leadership
NCSI Co-Directors:
Rorie Fitzpatrick, rfitzpa@wested.org, 415.615.3466
Kristin Reedy, kreedy@wested.org, 802.951.8218

Contact NCSI: https://ncsi.wested.org/


THANK YOU! http://ncsi.wested.org | @TheNCSI