1 Title I, Part D Data: SY 2012−13 Data Preview, Data Quality, and Upcoming CSPR Clarifications Dory Seidel and Jenna Tweedie, NDTAC.

2 Data Quality Overview

3 Why Is Data Quality Important? Trusting your data is important for informing:
–Funding and other decisionmaking
–Technical assistance (TA) needs
–Subgrantee monitoring
–Student programming

4 Factors Affecting Data Quality
–Clarity of what to report/definitions
–Changes to items/collection
–Data culture (e.g., data use, nonuse)
–Dedicated time/staff for data collection and reporting
–Staff turnover and changes in staff roles
–Data reporting/collection system(s)

5 Role of the Part D Coordinator Ultimately, you cannot “make” the data be of high quality, but you can implement systems that make high quality more likely by:
–Understanding the collection process
–Providing TA in advance
–Developing relationships
–Developing multilevel verification processes
–Tracking problems over time
–Using the data
–Linking decisions (funding, hiring, etc.) to data evidence
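The multilevel verification idea above can be sketched as a small automated check that flags subgrantee records for follow-up before they roll up into the State submission. This is a hypothetical illustration, not part of any actual NDTAC or State tooling; the field names and the 50% change threshold are assumptions.

```python
# Hypothetical sketch of a multilevel verification pass over one
# subgrantee record; field names and thresholds are illustrative.

def verify_record(record, prior_year=None):
    """Return a list of data-quality flags for one subgrantee record."""
    flags = []
    # Level 1: completeness. Required fields must be present.
    for field in ("students_served", "long_term_students"):
        if record.get(field) is None:
            flags.append(f"missing {field}")
    # Level 2: internal consistency. Subsets cannot exceed totals.
    served = record.get("students_served") or 0
    long_term = record.get("long_term_students") or 0
    if long_term > served:
        flags.append("long-term count exceeds students served")
    # Level 3: year-over-year reasonableness (tracking problems over time).
    if prior_year and prior_year.get("students_served"):
        change = abs(served - prior_year["students_served"]) / prior_year["students_served"]
        if change > 0.5:  # a >50% swing triggers a follow-up question
            flags.append("large change from prior year; confirm with subgrantee")
    return flags

flags = verify_record(
    {"students_served": 40, "long_term_students": 55},
    prior_year={"students_served": 120},
)
```

A coordinator could run a check like this on every submission and route the flags back to the subgrantee, which operationalizes both "developing multilevel verification processes" and "tracking problems over time."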

6 Sample State Data Trends

7 Data Quality Discussion

8 Clarifications to the Consolidated State Performance Report (CSPR) for SY 2013–14

9 Clarifications for Transition Services (Tables and , File Spec 182) In the first row of the table below, indicate whether programs/facilities receiving Subpart 1 funds within the State are legally permitted to track student outcomes after students leave the program or facility by entering Yes or No; if not, provide more information in the comment field. In the second row, provide the unduplicated count of students receiving transition services that specifically target planning for further schooling and/or employment.
FAQ on facilities collecting data on student outcomes after exit: If only some, but not all, facilities in the State can collect data on student outcomes after exit, enter “Yes” for the first question and provide a comment indicating why some facilities are unable to collect these data.

10 Clarifications for Academic and Vocational Outcomes (Tables & , File Specs 180 & 181) In the table below, for each program type, first provide the unduplicated number of students who attained academic and vocational outcomes while enrolled in the State agency program/facility, and then provide the unduplicated number of students who attained academic and vocational outcomes within 90 calendar days after exiting. If a student attained an outcome once in the program/facility and once during the 90-day transition period, that student may be counted once in each column, as appropriate. For “Enrolled in their local district school,” use the “90 days after exit” columns to provide the number of students who enrolled, or planned to enroll, in their local district school after exit.
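The counting rule above (a student may appear once in the in-facility column and once in the after-exit column, but never twice within a column) can be sketched with sets. The record layout and outcome name here are illustrative assumptions, not the actual file spec format.

```python
# Hypothetical sketch: deduplicate outcome records into the two CSPR
# columns. Each record is (student_id, outcome, period), where period is
# "in_facility" or "after_exit"; this layout is an assumption.

def count_outcome(records, outcome):
    """Unduplicated counts for one outcome: (in-facility, 90 days after exit)."""
    in_fac = {sid for sid, out, period in records
              if out == outcome and period == "in_facility"}
    after = {sid for sid, out, period in records
             if out == outcome and period == "after_exit"}
    # A student attaining the outcome in both periods is counted once in
    # each column, but duplicate records within a period collapse to one.
    return len(in_fac), len(after)

records = [
    ("s1", "earned_credits", "in_facility"),
    ("s1", "earned_credits", "in_facility"),   # duplicate: counted once
    ("s1", "earned_credits", "after_exit"),    # same student, other column
    ("s2", "earned_credits", "after_exit"),
]
```

Using sets makes the "unduplicated" requirement automatic: repeated records for the same student within a period collapse, while the two columns stay independent.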

11 Clarifications for Academic Performance in Reading and Mathematics (Tables & , File Specs 113 & 125) In the table below, provide the unduplicated number of long-term students served by Title I, Part D, Subpart 1 who participated in reading pre- and posttesting. Students should be reported in only one of the four change categories, and only a student’s most recent test results should be reported. Students who were pretested prior to July 1, 2013, may be included if their posttest was administered during the reporting year. Students who were posttested after the reporting year ended should be counted in the following year. Below the table is an FAQ about the data collected in this table.
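The assignment rule (each tested student lands in exactly one of four change categories based on the most recent pre/post pair) could look like the sketch below. The category labels, the grade-level-equivalent scale, and the one-grade-level cutoff are assumptions for illustration; States should follow the actual CSPR category definitions.

```python
# Hypothetical sketch: place each tested student into exactly one change
# category from pre- and posttest grade-level-equivalent (GLE) scores.
# Labels and cutoffs are illustrative assumptions.

def change_category(pre_gle, post_gle):
    diff = post_gle - pre_gle
    if diff < 0:
        return "negative change"
    if diff == 0:
        return "no change"
    if diff <= 1.0:
        return "improvement of up to one grade level"
    return "improvement of more than one grade level"

def most_recent_result(test_pairs):
    """Keep only the most recent (posttest_date, pre_gle, post_gle) record,
    since only a student's latest results are reported."""
    return max(test_pairs, key=lambda t: t[0])
```

Because the four branches are mutually exclusive, every student falls into exactly one category, which is what the table requires.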

12 Change: File Spec 135 Eliminated Academic Performance in Reading/Mathematics (Tables & ) Of the students reported in row 2 above, indicate the number who showed: *The unduplicated number of long-term students will no longer be calculated via the academic performance files, but rather entered as a number (like students served) through the participation file specifications, File Specs 119 and 127.

13 Clarifications to the CSPR for SY 2014–15

14 SY 2014–15 Outcome Table Updates (1) (Tables & , File Specs 180 & 181) The table of academic and vocational outcomes is reorganized into three smaller tables to group outcomes by (1) the setting in which the outcomes are achieved (in facility vs. out of facility) and (2) how many times students can achieve them. The instructions for the tables have been altered to reflect the new groupings: The first table includes outcomes a student can achieve only after exit. In this table, provide the unduplicated number of students who enrolled, or planned to enroll, in their local district school within 90 calendar days after exiting. A student may be reported only once per program type.

15 Academic and Vocational Outcomes (2) The second table includes outcomes a student can achieve only one time. In this table, provide the unduplicated number of students who attained the listed outcomes, reported either in the LEA program/facility column (“in fac.”) or in the “90 days after exit” column. A student may be reported only once across the two time periods, per program type.

16 Academic and Vocational Outcomes (3) The third table includes outcomes that a student may achieve more than once. In the “in fac.” column, provide the unduplicated number of students who attained academic and vocational outcomes while enrolled in the LEA program/facility. In the “90 days after exit” column, provide the unduplicated number of students who attained academic and vocational outcomes within 90 calendar days after exiting. If a student attained an outcome once in the program/facility and once during the 90-day transition period, that student may be reported once in each column.
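The three-table grouping in slides 14–16 amounts to three different deduplication rules. The sketch below routes one student's attainments for one outcome to the correct CSPR cells; the outcome names are illustrative assumptions, and because slide 15 does not say which column wins when a one-time outcome is attained in both periods, this sketch arbitrarily prefers the in-facility column.

```python
# Hypothetical sketch of the SY 2014-15 grouping: each outcome belongs to
# one of three tables with its own dedup rule. Outcome names are assumed.

OUTCOME_RULES = {
    # Table 1: achievable only after exit; once per program type.
    "enrolled_in_local_district_school": "after_exit_only",
    # Table 2: achievable only one time; once across both periods.
    "earned_hs_diploma": "once_across_periods",
    "earned_ged": "once_across_periods",
    # Table 3: achievable more than once; once in each column.
    "earned_course_credits": "once_per_column",
    "enrolled_in_job_training": "once_per_column",
}

def report_cells(outcome, periods_attained):
    """Return the CSPR cells in which one student is reported for one outcome.

    periods_attained is a set drawn from {"in_facility", "after_exit"}.
    """
    rule = OUTCOME_RULES[outcome]
    if rule == "after_exit_only":
        # Table 1 has only an after-exit cell.
        return ["after_exit"] if "after_exit" in periods_attained else []
    if rule == "once_across_periods":
        # Table 2: one report total; in-facility preferred (an assumption).
        if "in_facility" in periods_attained:
            return ["in_facility"]
        return ["after_exit"] if "after_exit" in periods_attained else []
    # Table 3: report once in each applicable column.
    return sorted(periods_attained)
```

Encoding the rule per outcome keeps the dedup logic in one place, so a student who, say, earns credits both in the facility and during the 90-day transition period is reported in both columns, while a GED earned twice over is still reported only once.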

17 NDTAC’s CSPR Collection Tool

18 NDTAC’s CSPR Collection Tool (cont. 1)
–Online collection tool for Subpart 1 and Subpart 2
–SEAs collect data from subgrantees

19 NDTAC’s CSPR Collection Tool (cont. 2)
–Updated for SY 2013–14
–Customizable for individual State collections
–Survey Gizmo
–SEA fee: $75/month or $810/year
–Contact NDTAC if you are interested