1 2007 OSEP National Early Childhood Conference, December 2007

2 NJEIS GENERAL SUPERVISION SYSTEM

3 NJEIS FUNDING TRENDS

4 NJEIS GENERAL SUPERVISION COMPONENTS
- Central Management Office (Data Collection)
- Data Desk Audit & Inquiry
- Self-Assessment
- Focused On-site Monitoring
- Targeted Technical Assistance
- Procedural Safeguards/Dispute Resolution
- Enforcement

5 NJEIS INFRASTRUCTURE
Lead Agency - Quality Assurance Team
- Contracts
- Procedural Safeguards
- Central Management Office
- Monitoring
- Personnel Development
Regional Early Intervention Collaboratives (REICs - 4)
Service Coordination Units (SCUs - 21)
Early Intervention Programs (EIPs - 80+) / Practitioners (3,500+)
- Targeted Evaluation Teams
- Comprehensive Programs
- Service Vendors

6 GENERAL SUPERVISION ACTIVITIES TO ENSURE COMPLIANCE
To ensure ongoing compliance and a timely response to emerging issues, data are reviewed periodically as follows:
- Service Coordination Units (County SPOE) review weekly; and
- Regional Early Intervention Collaboratives review monthly.
NJEIS provides technical assistance as appropriate when issues are identified through these ongoing reviews.

7 CENTRAL MANAGEMENT OFFICE (CMO)

8 COVANSYS CMO
The Covansys system is designed specifically for early intervention and has been in use for over 10 years. Four states currently use the system and are actively developing and improving the software.

9 CMO FEATURES
- Child-specific data collection
- State access to timely statewide data
- Local access to data
- Data verification (accuracy)
- Accountability
- Timely system of payment
- Maximization of funding resources
- Support for monitoring
- Personnel enrollment/matrix reports

10 CHILD SPECIFIC DATA COLLECTION
[Timeline graphic: Referral, then Initial IFSP (45 days), then Annual IFSP at 1-year intervals, then Transition and Exit; timely services are tracked from entry to exit.]

11 ACCOUNTABILITY FEATURES
- Child must be eligible for Early Intervention.
- Child must have an active IFSP to receive authorization for services.
- Practitioners must pass a credentialing process in which their experience and licenses are verified.
- An Explanation of Benefits is sent to the family.
- Billing authorizations are created from a completed IFSP and ensure services are provided according to the IFSP.

12 DATA VERIFICATION
- REICs are responsible for entering IFSP information into the SPOE system, which provides ongoing accountability ("oops" tickets).
- Paper flow: REIC data entry, followed by the "funky data" inquiry process.
- Each service is matched against the practitioner's specialty to ensure the practitioner is qualified.
- Data verification during on-site visits.

13 CMO SUPPORTS MONITORING
The CMO provides standard and customized reports that support:
- Federal reporting requirements, including 618, SPP, and APR
- Quality assurance of federal and state performance and compliance requirements
- Analysis of child outcome data
- Tracking of improvement progress and correction
- Reporting to the state, stakeholders, and the public

14 CMO REPORTS
- Days from referral to initial IFSP (percent of IFSPs that took place >45 days after the referral date).
- Days from the IFSP meeting to the start of services (percent of services starting >30 days after the IFSP meeting date).
- Percent of IFSP services provided in settings other than the natural environment.
- Frequency of periodic reviews, and whether each review takes place within 6 months of the IFSP start date.
- Timing of annual IFSP meetings and the percent that exceed 12 months.
- Transition Planning Conference within 90 days of the child turning 3 years old.
- Children who have exited the system and the reason they transitioned out of NJEIS.
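The timeline reports above reduce to simple date arithmetic over child records. A minimal sketch of the first report (percent of initial IFSPs held more than 45 days after referral); the record fields and sample dates are illustrative, not the actual CMO schema:

```python
from datetime import date

# Hypothetical child records; field names are illustrative only.
records = [
    {"referral": date(2007, 1, 3), "initial_ifsp": date(2007, 2, 12)},  # 40 days
    {"referral": date(2007, 1, 5), "initial_ifsp": date(2007, 3, 1)},   # 55 days
]

def pct_late_ifsp(records, limit_days=45):
    """Percent of initial IFSPs held more than `limit_days` after referral."""
    late = sum(
        1 for r in records
        if (r["initial_ifsp"] - r["referral"]).days > limit_days
    )
    return 100.0 * late / len(records)

print(f"{pct_late_ifsp(records):.1f}% of initial IFSPs exceeded 45 days")
```

The same subtraction-and-threshold pattern covers the 30-day service start and 12-month annual IFSP reports, with only the field names and limit changing.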

15 DATA DESK AUDIT, INQUIRY & CORRECTIVE ACTION PLAN

16 IDENTIFICATION & CORRECTION OF NONCOMPLIANCE
1. Review of data
2. Agency performance inquiry (off-site)
3. Analysis of the agency's response to the inquiry
4. Lead agency determination of noncompliance
5. Development of a Corrective Action Plan (CAP)
6. Correction of noncompliance within 12 months

17 OFFSITE PERFORMANCE INQUIRY
Existing performance data on required indicators are sent to agencies by the state office. The agency reviews the data and responds to a series of questions within 10-15 business days:
- Data verification (clean up missing or incorrect information, such as dates)
- What was the reason for each delay?
- Has the delay since been corrected?
- What barriers contribute to the poor performance?
- What was the response and/or correction to those barriers?
- What is being done to improve performance?

18 DATA VERIFICATION (Clean-up)
Monitoring Team Data Review Matrix
- Lead agency data verification
- Regional data clean-up
- Lead agency desk audit
- Additional local data verification and clean-up during inquiry

19 NJEIS data desk audits are conducted annually by lead agency staff to monitor Transition Planning Conference (TPC) timelines.
- A TPC timeline data run of all children turning three is conducted for all twenty-one counties.
- Additional information is obtained as necessary from county agencies through an inquiry process.
- NJ's twenty-one counties are ranked based on cohort size (small, medium, large).
- Findings of noncompliance are determined and corrective action plans (CAPs) are developed, including required evidence of change.
- NJEIS provides technical assistance, monitors correction of the noncompliance, and ensures correction within one year of notifying the county of the noncompliance.
Each year NJEIS identifies focused areas for on-site monitoring based on statewide compliance and performance data.
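The TPC data run described above could be sketched as a per-county pass over audit rows. The 90-day window test and every field name below are assumptions for illustration, not the NJEIS database layout:

```python
from datetime import date

# Illustrative audit rows; field names and dates are hypothetical.
audit = [
    {"county": "A", "third_birthday": date(2008, 6, 1), "tpc": date(2008, 3, 15)},
    {"county": "A", "third_birthday": date(2008, 7, 10), "tpc": None},  # no TPC recorded
    {"county": "B", "third_birthday": date(2008, 5, 20), "tpc": date(2008, 4, 1)},
]

def tpc_timely(row, window_days=90):
    """TPC was held, and held within `window_days` before the third birthday."""
    if row["tpc"] is None:
        return False
    delta = (row["third_birthday"] - row["tpc"]).days
    return 0 <= delta <= window_days

# Tally (timely, total) per county, then report smallest cohort first.
by_county = {}
for row in audit:
    met, total = by_county.get(row["county"], (0, 0))
    by_county[row["county"]] = (met + int(tpc_timely(row)), total + 1)

for county, (met, total) in sorted(by_county.items(), key=lambda kv: kv[1][1]):
    print(f"County {county}: cohort {total}, {100.0 * met / total:.0f}% timely TPC")
```

Sorting by cohort total mirrors the slide's small/medium/large ranking; a real run would bucket the counts into those three bands rather than sort raw sizes.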

20 Desk Audit Transition Planning Conference (TPC)

21 DETERMINATION OF NONCOMPLIANCE
Inquiry data are reviewed by the state office, looking at agency-submitted reasons for delay. The agency is not held accountable for delays related to family reasons. The state determines whether the agency should be issued a finding of noncompliance. If yes, the agency is notified of noncompliance with the federal requirement.

22 Data Desk Inquiry

23 Data Desk Inquiry (continued)

24 TPC ( ) State Performance
The desk audit identified that 76.1% of the child records reviewed received a TPC. After inquiry to the local counties, 96% of these child records documented that a TPC meeting occurred within 90 days prior to the child's third birthday. The inquiry identified missing or incomplete data in the database, family reasons for delays, or families declining a TPC.

25 TPC ( ) Local County Performance
A desk audit for one county identified that 60.4% of the children reviewed had a TPC meeting. After inquiry to this county, 85% of these child records documented that a TPC meeting occurred within 90 days prior to the child's third birthday. The inquiry identified missing or incomplete data in the database, family reasons for delays, or families declining a TPC. A CAP was issued to the program to correct the noncompliance.

26 LOCAL AGENCY CAP INCLUDES
- Required evidence of change, with dates for required reporting
- Activities to help with improvement
- TA available as needed
- Change is required!
- A letter is sent to the agency as soon as the CAP is successfully completed.

27 CAP REQUIRED EVIDENCE OF CHANGE
Baseline: 85% of children had a TPC 90 days prior to their third birthday.
- Target Day 30: 85% of children
- Target Day 60: 90% of children
- Target Day 90: 95% of children
- Target Day 120: 100% of children
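Checking a reported rate against this escalating schedule is a one-line comparison per reporting date. A minimal sketch using the day-to-percent targets from the slide (the function name is illustrative):

```python
# CAP evidence-of-change schedule from the slide: day -> required percent timely.
targets = {30: 85, 60: 90, 90: 95, 120: 100}

def cap_on_track(day, observed_pct):
    """True if the observed compliance rate meets the target due at `day`."""
    return observed_pct >= targets[day]

print(cap_on_track(60, 92))   # prints True: meets the 90% day-60 target
print(cap_on_track(120, 97))  # prints False: misses the 100% day-120 target
```

An agency still below target at day 120 would fall into the slide-29 sanctions path, since the final target is full compliance.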

28 Corrective Action Plan (CAP)

29 COMPLIANCE NOT ACHIEVED WITHIN ONE YEAR OF FINDING
If an agency completing the monthly data requirements does not demonstrate 100% compliance near the 12-month timeline, the agency is required to submit a written explanation of why 100% compliance was not achieved. The lead agency reviews it, requests further information as needed, and issues additional sanctions. Sanctions include:
- Ongoing monthly reporting.
- Identification of measurable activities that will drive improvement toward correction.
- State-directed training and/or technical assistance.
- An on-site focused visit requested by the lead agency.
- Placement of the agency under "at risk" or "special conditions" status.
- Reduction or withholding of funding.

30 SELF-ASSESSMENT

31 SELF-ASSESSMENT
- DHSS-NJEIS contracts require NJEIS provider agencies to submit an annual self-assessment report.
- Facilitates supervision through monthly record review and practitioner observation requirements.
- Provides transition indicator 8a and 8b data currently unavailable through the CMO data system.
- Self-identified improvement planning is expected to proactively remedy performance concerns.
- Timely and accurate reporting is tracked for local performance reporting and determinations.

32 ON-SITE FOCUSED MONITORING

33 ONSITE MONITORING
Decisions to conduct on-site focused monitoring visits may be made under the following circumstances:
- As needed, based on incident reports or procedural safeguards complaints;
- As needed, based on concerns identified through ongoing review of system point of entry (SPOE) or self-assessment data; and
- Based on ranked performance data related to priority indicators.