1 Results-based Monitoring for IDEA & ESEA Programs
Office of Consolidated Planning & Monitoring, Fall 2014

2 Results-based Monitoring
Purpose: To monitor and support districts in the implementation of IDEA & ESEA programs that improve outcomes for students, while recognizing that continuous improvement is necessary.
Shifting our focus from COMPLIANCE to PERFORMANCE.

3 IDEA Monitoring: Shift of Focus
- IDEA compliance monitoring has produced 85-95% compliance in areas such as student records and parent notification.
- Student outcomes, as measured by academic achievement, graduation rate, and dropout rate, have not improved over time.
- Focus is changing from strict compliance to student outcomes.

4 Continuous Improvement Cycle

5 CPM: Monitoring
Desired Outcomes: a streamlined process for districts; improved planning and monitoring around targeted accountability goals; a Cycle of Continuous Improvement for district improvement.
First Steps: consolidation of monitoring tools; creating and piloting a results-based tool and monitoring process; training and additional support for LEAs through CPM and Special Populations.
Purpose: To monitor and support districts in the implementation of IDEA & ESEA programs that improve outcomes for students, while recognizing that continuous improvement is necessary. Results-based monitoring will be implemented in the 2014-15 school year.

6 Results-based Monitoring Tool

7 Process Overview
Why a results-based monitoring tool?
- Shifts focus from compliance to program effectiveness
- Encourages collaborative conversations around district programs
- Provides a better understanding of successes and challenges
How is the results-based monitoring tool organized?
- Based on factors influencing student outcomes
- Combines IDEA & ESEA monitoring items
- Adds an Improvement Plan focusing on suggested strategies to increase student outcomes
- Includes a Compliance Action Plan

8 Process Overview
Who is the monitoring team?
- Lead: CPM Regional Consultant
- Staff representing Special Education and other critical subgroups and areas (EL, non-public, etc.)
- Fiscal consultants, when needed
- CPM Nashville staff, when needed

9 Monitoring Process Overview
Step 1. TDOE identifies LEAs based on risk and notifies selected LEAs.
Step 2. Data collection:
- TDOE gathers assessment and growth data on districts and schools by proficiency level, subject, and subgroup (SWD, ED, etc.)
- TDOE reviews the LEA consolidated application and budget for ESEA and IDEA
- TDOE reviews the LEA strategic plan
- TDOE requests that the LEA upload specific items
Step 3. TDOE selects schools based upon school assessment results and other factors:
- At least two schools are selected for a two-hour onsite visit
- Additional sites, such as a preschool, non-public school, or program, may also be selected

10 Monitoring Process Overview
Step 4. Phone call between TDOE and the LEA to explain the process and clarify expectations:
- TDOE wants to see and hear about day-to-day procedures
- LEA and school interviews
- Note: preparing boxes of information is not necessary
- The agenda is negotiated with the LEA
Step 5. On-site visit, approximately 3.5 days:
- Entrance meeting (approximately one hour) with district leadership and other relevant staff to review the district strategic plan, initiatives, best practices, challenges, etc.
- Meetings with the ESEA Director, IDEA Supervisor, and program staff
- School visit schedules
- Time slots (at least two) for TDOE to write comments and the final report
- Review and modification of the final report with the ESEA Director and IDEA Supervisor
- Exit conference with LEA leadership

11 Monitoring Process Overview
Step 6. Written Comments, Improvement Plan, Compliance Plan, and LEA Letter:
- Prior to the exit conference, the results-based monitoring comments are drafted
- The Improvement Plans and Compliance Action Plans are finalized in the exit conference (suggestions from the LEA are incorporated into the final document)
- A letter indicating Closed or Incomplete status is sent to the LEA within two weeks

12 Monitoring Process Overview
- Documentation required for the visit is uploaded electronically.
- Many documents are reviewed on-site, but no copies of documentation are expected.
- The process relies heavily on interviews with LEA staff and listening to their procedures and challenges within their districts.
- School site visits are conducted by meeting with the principal and school leadership, then walking through classrooms and parts of the building.
- TDOE staff write up all comments, improvement plans, and compliance action plans while in the district.
- An on-site exit conference reviews the completed written monitoring instrument with LEA leadership.

13 Major Sections of the Monitoring Tool
- Quality Leadership
- Effective Teachers
- Instructional Practices
- Climate and Culture
- Parent and Community Involvement
- Appendices

18 District Selection: Risk Analysis

19 District Selection: Risk Analysis
- Designed to target TDOE support where it is most needed
- IDEA & ESEA monitoring selection is based on a risk analysis:
  - IDEA & ESEA indicators are weighted more heavily for program monitoring
  - Fiscal indicators are weighted more heavily for fiscal monitoring
  - TDOE may add additional on-site reviews on an as-needed basis
- Informational webinars will be scheduled for LEAs selected for monitoring

20 District Selection: Risk Analysis
- An annual risk analysis identifies which LEAs are perceived to be at-risk based on various indicators.
- The risk analysis includes programmatic, fiscal, administrative, and achievement components of both IDEA and ESEA.
- The programmatic risk analysis is weighted differently for program monitoring and for fiscal monitoring.
- There is often considerable overlap between program and fiscal risk, so many LEAs will receive both program and fiscal monitoring.
- Two LEAs not identified through the risk analysis are randomly selected for results-based and fiscal monitoring.
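The slides do not spell out the weighting mechanics. One plausible reading, sketched below purely for illustration, is a weighted sum over per-category indicator subtotals, with different weight sets producing the program-monitoring and fiscal-monitoring rankings. The composite_score function, the category names, the subtotals, and the weight values are all assumptions, not the department's actual formula.

# A minimal sketch, assuming a weighted-sum model of the risk analysis.
# Only the idea that program indicators weigh more heavily for program
# monitoring (and fiscal indicators for fiscal monitoring) comes from the
# slides; every number below is invented for illustration.

def composite_score(category_points, weights):
    """Weighted sum of per-category indicator points for one LEA."""
    return sum(weights[category] * points
               for category, points in category_points.items())

# Invented per-category point subtotals for a single LEA.
category_points = {"program": 6, "fiscal": 4, "achievement": 3}

# Invented weight sets: the same subtotals feed two differently weighted rankings.
program_weights = {"program": 2.0, "fiscal": 1.0, "achievement": 1.5}
fiscal_weights = {"program": 1.0, "fiscal": 2.0, "achievement": 1.0}

print(composite_score(category_points, program_weights))  # 2.0*6 + 1.0*4 + 1.5*3 = 20.5
print(composite_score(category_points, fiscal_weights))   # 1.0*6 + 2.0*4 + 1.0*3 = 17.0

Under such a reading, the same indicator data could rank a district high for program monitoring but lower for fiscal monitoring, which is consistent with the overlap noted above.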

21 District Selection: Risk Analysis Indicators
The following categories are included in the risk analysis (detailed on the following slides): Fiscal, Personnel, Monitoring & Audit, Reporting Deadlines, Complaints & Hearings, Student Results, and Additional Risk Concerns.

22 District Selection: Risk Analysis Indicators
FISCAL
- Indicator 1: ESEA Title I Allocation > $500,000 {1 point}
- Indicator 2: ESEA Title I Allocation > $1,500,000 {1 point}
- Indicator 3: ESEA Title I Allocation > $5,000,000 {1 point}
- Indicator 4: ESEA Discretionary Grants FY14 {1 point per}
- Indicator 5: IDEA Part B Allocation > $500,000 {1 point}
- Indicator 6: IDEA Part B Allocation > $1,500,000 {1 point}
- Indicator 7: IDEA Part B Allocation > $3,000,000 {1 point}
- Indicator 8: IDEA Preschool Allocation > $75,000 {1 point}
- Indicator 9: IDEA Part B Discretionary Grants FY14 {1 point per}
- Indicator 10: IDEA Preschool Discretionary Grants FY14 {1 point per}
- Indicator 11: Title II-A Funds Used for Class Size Reduction FY14 {1 point}

23 District Selection: Risk Analysis Indicators
FISCAL (continued)
- Indicator 12: ESEA Maintenance of Effort (MOE) Non-compliance FY13 {1 point each}
- Indicator 13: IDEA Maintenance of Effort (MOE) Non-compliance FY13 {1 point each}
- Indicator 14: ESEA, IDEA & CTE Potential Drop-dead Funds Left to Draw > $0 FY13 (as of Aug. 1, 2014) {1 point each}
- Indicator 15: Title I Potential Carryover = or > 15% FY14 (as of Aug. 1, 2014) {1 point}
- Indicator 16: IDEA Part B Potential Excess Carryover = or > 50% FY14 (as of Aug. 1, 2014) {1 point per grant}
- Indicator 17: IDEA Preschool Potential Excess Carryover = or > 50% FY14 (as of Aug. 1, 2014) {1 point per grant}
- Indicator 18: Bookkeeper < 3 yrs experience with Federal Programs FY14 {1 point}

24 District Selection: Risk Analysis Indicators
PERSONNEL
- Indicator 19: ESEA Director < 3 yrs experience FY14 {1 point}
- Indicator 20: IDEA Director < 3 yrs experience FY14 {1 point}
- Indicator 21: Director of Schools < 3 yrs experience FY14 (NOT USED for FY14) {1 point}
MONITORING & AUDIT
- Indicator 22: ESEA Monitoring CAP 2013-14 {1 point per item}
- Indicator 23: IDEA Monitoring CAP 2013-14 {1 point per item}
- Indicator 24: Fiscal Monitoring CAP 2013-14 {1 point per item}
- Indicator 25: CAP Items Not Addressed in a Timely Manner for 2012-13 {1 point per item}
- Indicator 26: TDOE Single Audit (A-133) Findings 2013-14 {1 point per item}
- Indicator 27: US Ed Monitoring Findings 2013-14 {1 point per item}

25 District Selection: Risk Analysis Indicators
REPORTING DEADLINES
- Indicator 28: Missed Deadline for ePlan Consolidated Funding Application & Projected Budget FY15 {1 point}
- Indicator 29: Missed Deadline for ePlan Final Budget FY14 {1 point}
- Indicator 30: Missed Deadline for Title I Comparability and/or Not Compliant 2013-14 {1 point}
- Indicator 31: Missed Deadline for Final Expenditure Report (NOT USED for FY14) {1 point}
- Indicator 32: Missed Deadline for IDEA Dec. 1 Census Counts 2013-14 {1 point}
- Indicator 33: Missed Deadline for IDEA EOY Statewide Frequency 2013-14 {1 point}
- Indicator 34: Missed Deadline for IDEA SSEER Report (NOT USED for FY14) {1 point}
- Indicator 35: Missed Deadline for IDEA Excess Cost Report (NOT USED for FY14) {1 point}

26 District Selection: Risk Analysis Indicators
COMPLAINTS & HEARINGS
- Indicator 36: ESEA Complaints 2013-14 {1 point per}
- Indicator 37: IDEA Complaints 2013-14 {1 point per}
- Indicator 38: IDEA Due Process Hearings 2013-14 {1 point per}
STUDENT RESULTS
- Indicator 39: LEA In Need of Improvement (INI) 2013-14 {1 point}
- Indicator 40: LEA In Need of Subgroup Improvement (INSI) 2013-14 {1 point}
- Indicator 41: LEA Exemplary 2013-14 {credit 1 point}
- Indicator 42: LEA Reward Schools 2014 {credit 1 point per}
- Indicator 43: LEA Priority Schools 2015 {1 point per}
- Indicator 44: LEA Focus Schools 2015 {1 point per}
- Indicator 45: LEA 3-8 Math, All Students, % Below Basic > State % 2013-14 {1 point}
- Indicator 46: LEA 3-8 Reading, All Students, % Below Basic > State % 2013-14 {1 point}

27 District Selection: Risk Analysis Indicators
STUDENT RESULTS (continued)
- Indicator 47: LEA Algebra I, All Students, % Below Basic > State % 2013-14 {1 point}
- Indicator 48: LEA English II, All Students, % Below Basic > State % 2013-14 {1 point}
- Indicator 49: LEA 3-8 Math, Students with Disabilities, % Below Basic > State % 2013-14 {1 point}
- Indicator 50: LEA 3-8 Reading, Students with Disabilities, % Below Basic > State % 2013-14 {1 point}
- Indicator 51: LEA Algebra I, Students with Disabilities, % Below Basic > State % 2013-14 {1 point}
- Indicator 52: LEA English II, Students with Disabilities, % Below Basic > State % 2013-14 {1 point}
- Indicator 53: LEA Suspension Rates > State Rate 2013-14 (NOT USED for FY14) {1 point per subgroup}

28 District Selection: Risk Analysis Indicators
STUDENT RESULTS (continued)
- Indicator 54: LEA Students with Disabilities Graduation Rate < State Rate 2011-12 {1 point per subgroup}
- Indicator 55: LEA IDEA Least Restrictive Environment (LRE) (% SWD in General Education Setting = or > 80% of Day) 2013-14 {1 point}
- Indicator 56: LEA IDEA Disproportionate Representation of SWD by Race/Ethnicity 2012-13 {1 point}
ADDITIONAL RISK CONCERNS
- Indicator 57: CPM Office - TDOE Concerns of Additional Risk {5 points}
- Indicator 58: IDEA Program - TDOE Concerns of Additional Risk {5 points}
- Indicator 59: Fiscal - TDOE Concerns of Additional Risk {5 points}
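The point annotations above ({1 point}, {1 point per}, {credit 1 point}, {5 points}) suggest a simple additive tally per LEA, with "credit" indicators subtracting from the total. The slides do not show the actual computation or the selection cutoff, so the following is a minimal sketch under that assumption; the Indicator structure, its field names, and the example district are hypothetical, and only the point values come from the indicator list.

# A minimal sketch, assuming the indicator points are summed per LEA.
# Only the point values come from the slides; the data structure and the
# example district are invented for illustration.

from dataclasses import dataclass

@dataclass
class Indicator:
    number: int       # indicator number from the slides (1-59)
    points: int       # points per occurrence; negative for "credit" indicators
    occurrences: int  # how many times the condition was met for this LEA

def lea_risk_score(indicators):
    """Sum points across all triggered indicators for one LEA."""
    return sum(ind.points * ind.occurrences for ind in indicators)

# Example district: one missed ePlan deadline (Indicator 28, 1 point),
# two Focus Schools (Indicator 44, 1 point per), one Reward School
# (Indicator 42, credit 1 point), and one CPM additional-risk concern
# (Indicator 57, 5 points).
example = [
    Indicator(number=28, points=1, occurrences=1),
    Indicator(number=44, points=1, occurrences=2),
    Indicator(number=42, points=-1, occurrences=1),
    Indicator(number=57, points=5, occurrences=1),
]

print(lea_risk_score(example))  # 1 + 2 - 1 + 5 = 7

Higher totals would flag an LEA as higher risk for selection; the ranking rule or threshold TDOE actually applies is not given in the slides.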

29 2014-15 IDEA & ESEA Results-based & Joint Fiscal Monitoring
TDOE Consolidated Planning & Monitoring

30 Pilot Survey Results

31 Participating LEA Demographics
Six districts and twelve schools volunteered for the monitoring pilot.

32 Relevant Survey Results

33 FRAUD, WASTE or ABUSE
Citizens and agencies are encouraged to report fraud, waste, or abuse in state and local government.
NOTICE: This agency is a recipient of taxpayer funding. If you observe an agency director or employee engaging in any activity which you consider to be illegal, improper, or wasteful, please call the state Comptroller's toll-free hotline: 1-800-232-5454. Notifications can also be submitted electronically at http://www.comptroller.tn.gov/hotline
