Personnel Development to Improve Services and Results for Children with Disabilities: Performance Measures

Presentation transcript:

0 Personnel Development to Improve Services and Results for Children with Disabilities: Performance Measures
Craig Stanton, Office of Planning, Evaluation, and Policy Development
Bonnie Jones, Office of Special Education Programs
July 18, 2007

1 Program Assessment Rating Tool (PART)
Developed by the Office of Management & Budget (BPI Council). Designed to:
- Provide a systematic method of assessing the performance of Federal programs across agencies
- Create a better way of linking budget decisions to performance (i.e., Budget and Performance Integration)
- Incorporate a review of the broad array of factors that affect program outcomes
- Build on and improve GPRA performance reporting and performance management

2 Structure of the PART
- Asks a series of yes/no questions in 4 sections:
  I. Program purpose & design
  II. Strategic planning
  III. Program management
  IV. Program results and accountability
- 25 basic questions, plus selected sets of questions tailored to specific types of programs (formula, discretionary, direct service, R&D, etc.)
- Weights are assigned to each section and question

3 How does a program earn a high PART rating?
- Must have meaningful program outcome objectives, indicators, and efficiency measures
- Must collect performance data and have baselines and ambitious targets
- Must use data to manage the program

4 Scores Translated into PART Ratings
Rating (number of ED programs): Score range
- Effective (4): 85-100
- Moderately Effective (7): 70-84
- Adequate (26): 50-69
- Ineffective (7): 0-49
- Results Not Demonstrated (programs lacking measures/data) (45): assigned regardless of score
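To make the scoring mechanics on the last two slides concrete, here is a minimal Python sketch (not part of the original presentation) of how weighted yes/no section scores might roll up into an overall score and rating band. The section weights, example scores, and scoring logic are illustrative assumptions; the score-to-rating cutoffs follow the ranges listed above, and the authoritative rules are in OMB's published PART guidance.

```python
# Illustrative only: the section weights and example scores below are
# assumptions for this sketch; the authoritative rules are in OMB's PART guidance.
from typing import Dict

# Score bands used to translate a 0-100 PART score into a rating.
RATING_BANDS = [
    (85, "Effective"),
    (70, "Moderately Effective"),
    (50, "Adequate"),
    (0, "Ineffective"),
]

# Hypothetical weights for the four PART sections (must sum to 1.0).
SECTION_WEIGHTS = {
    "Program Purpose & Design": 0.20,
    "Strategic Planning": 0.10,
    "Program Management": 0.20,
    "Program Results & Accountability": 0.50,
}

def overall_score(section_scores: Dict[str, float]) -> float:
    """Roll per-section scores (0-100) up into a single weighted PART score."""
    return sum(SECTION_WEIGHTS[name] * score for name, score in section_scores.items())

def rating(score: float, has_measures_and_data: bool = True) -> str:
    """Map a numeric score to a rating band; programs lacking acceptable
    measures or data are rated 'Results Not Demonstrated' regardless of score."""
    if not has_measures_and_data:
        return "Results Not Demonstrated"
    for cutoff, label in RATING_BANDS:
        if score >= cutoff:
            return label
    return "Ineffective"

# Example: 0.2*100 + 0.1*75 + 0.2*80 + 0.5*60 = 73.5 -> Moderately Effective
example = {
    "Program Purpose & Design": 100,
    "Strategic Planning": 75,
    "Program Management": 80,
    "Program Results & Accountability": 60,
}
print(overall_score(example), rating(overall_score(example)))
```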

5 Are PARTs and scores public?
- Every program that has been reviewed using the PART has a summary posted online; summaries link to detailed program assessments.
- Additional resources for the PART, including detailed instructions for every question (PART Guidance), are also available online.

6 So what happens after the PART?
- Agencies are held accountable for implementing PART follow-up actions, also known as improvement plans, for every PARTed program.
- Follow-up actions are program specific.
- They generally include actions like developing better measures, collecting better data, or developing a plan for evaluating the program.

7 National Activities Program (Part D): Student Results
- Technology
- State Personnel Development Grants
- Personnel Development
- Technical Assistance, Model Demonstration, Dissemination, & Implementation
- Parent Training & Information

8 OSEP Accountability Framework (diagram)
- Program Performance Measures
- Project Data
- Scholar Data
- Personnel Data

9 OSEP Accountability Framework (diagram, continued)
- Program Performance Measures
- Project Data
- Scholar Data
- Personnel Data
- Data sources: Student Data Collection, Annual Reports, Service Obligation

10 Measure 1: The percentage of Special Education Personnel Preparation projects that incorporate evidence-based practices in the curriculum
Annual review of syllabi by expert panel for all new grantees:
- Leadership
- High-Incidence (beginning in FY07, submitted at end of Year 1)
- Low Incidence
- Early Childhood
- Minority
- Related Services

11 Measure 2: The percentage of scholars completing Special Education Personnel Preparation funded training programs who are knowledgeable and skilled in evidence-based practices
- Praxis II Special Education (for teachers)
- Other performance measures, for example for related service personnel, or in states where measures other than Praxis II are used
- Annual Performance Reports this year (FY 2004 grantees only); future collection through the OSEP Student Data Report for all grantees

12 Measure 3: The percentage of Special Education Personnel Preparation funded scholars who exit training programs prior to completion due to poor academic performance
- Data collected through the OSEP Student Data Report
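As a concrete illustration of how a percentage measure like this can be computed from scholar-level records, here is a minimal Python sketch; the record fields and exit-reason codes are hypothetical, and the actual OSEP Student Data Report defines its own data elements.

```python
# A minimal sketch of the arithmetic behind Measure 3, assuming a hypothetical
# scholar-level record layout; the actual OSEP Student Data Report defines its
# own fields and exit-reason categories.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ScholarRecord:
    scholar_id: str
    exited_early: bool = False           # left the funded program before completion
    exit_reason: Optional[str] = None    # e.g., "poor_academic_performance"

def measure_3(records: List[ScholarRecord]) -> float:
    """Percent of funded scholars who exited prior to completion
    due to poor academic performance."""
    if not records:
        return 0.0
    exited_for_academics = sum(
        1 for r in records
        if r.exited_early and r.exit_reason == "poor_academic_performance"
    )
    return 100.0 * exited_for_academics / len(records)

# Example: 2 of 50 funded scholars exited for academic reasons -> 4.0 percent
print(measure_3(
    [ScholarRecord(f"S{i}") for i in range(48)]
    + [ScholarRecord("S48", True, "poor_academic_performance"),
       ScholarRecord("S49", True, "poor_academic_performance")]
))  # 4.0
```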

13 FY 2001 – FY 2005 (chart of Measure 3 results; y-axis: percent)

14 Measure 4: The percentage of low-incidence positions that are filled by personnel who are fully qualified under IDEA
Proposed plan:
- Data submitted by SEAs and collected through the Section 618 data collection for the Annual Report to Congress
- Data collection to begin school year

15 Measure 5: The percentage of Special Education Personnel Preparation funded degree/certification program recipients who are working in the area(s) in which they were trained upon program completion
- Data collected through the OSEP Student Data Report

16 FY 2002 – FY 2005 (chart of Measure 5 results; y-axis: percent)

17 Measure 6: The percentage of Special Education Personnel Preparation funded degree/certification program recipients who are working in the area(s) in which they were trained upon program completion and who are fully qualified under IDEA
- Data collected through the Annual Performance Report this year from FY 2004 grantees
- Future data collected on the OSEP Student Data Report

18 Measure 7: Employed for three or more years: The percentage of degree/certification recipients who maintain employment for 3 or more years in the areas for which they were trained and who are fully qualified under IDEA
- This year, data collected through a sample of grantees (9 institutions)
- Future data collected from the National Center on Student Obligation database

19 Measure 8: The percentage of funds expended on scholars who drop out of programs because of:
1. Poor academic performance
2. Scholarship support being terminated when the Federal grant to their institution ends
- Data collected through the OSEP Student Data Report
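Because Measure 8 is funds-based rather than head-count-based, a short sketch may help show the arithmetic. The field names and reason codes below are hypothetical stand-ins; the actual dollar amounts and drop-out categories are whatever the OSEP Student Data Report collects.

```python
# A minimal sketch of the arithmetic behind Measure 8, assuming a hypothetical
# record layout; the actual data elements are defined by the OSEP Student Data Report.
from dataclasses import dataclass
from typing import Dict, List, Optional

@dataclass
class ScholarFunding:
    scholar_id: str
    funds_expended: float                # grant funds spent on this scholar
    dropped_out: bool = False
    drop_reason: Optional[str] = None    # "poor_academic_performance" or
                                         # "scholarship_terminated_grant_ended"

def measure_8(records: List[ScholarFunding]) -> Dict[str, float]:
    """Percent of all expended funds spent on scholars who dropped out,
    broken out by the two reasons named in the measure."""
    total = sum(r.funds_expended for r in records)
    if total == 0:
        return {}
    by_reason = {"poor_academic_performance": 0.0,
                 "scholarship_terminated_grant_ended": 0.0}
    for r in records:
        if r.dropped_out and r.drop_reason in by_reason:
            by_reason[r.drop_reason] += r.funds_expended
    return {reason: 100.0 * spent / total for reason, spent in by_reason.items()}

# Example: $10,000 of $200,000 total spent on scholars who dropped out for
# academic reasons -> 5.0 percent for that category.
```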

20 SERVICE OBLIGATION

21 Service Obligation: Which Rules Apply and When?
There are three relevant sets of rules with respect to the service obligation:
1. The December 9, 1999 regulations apply to individuals supported by grants awarded in FY 2004 or earlier.

22 Which Rules Apply and When?
2. The "Additional Requirements" section of the Personnel Preparation To Improve Services and Results for Children With Disabilities--Combined Priority and the Leadership notices for Personnel Preparation, published in the Federal Register on March 25, 2005, applies to awards made in FY 2005.

23 Which Rules Apply and When?
3. The final regulations implementing Section 662(h) of the Individuals with Disabilities Education Improvement Act (IDEA) of 2004 (see 71 FR) became effective July 5, 2006 and apply to awards made in FY 2006 and later.

24 How Does Continuation Funding Affect the Rules?
Grantees may receive continuation funding in later years; however, the rules in effect in the year a grant was initially awarded apply to all future years of that grant.

25 How Will the Department Monitor the Service Obligation?
- A contract will be awarded to monitor the service obligation and provide a website and web-based data collection system.
- The website will provide information.
- A helpline will also be available.

26 Frequently Asked Questions
Please submit questions you would like answered. Providing your contact information would be helpful so that we can ask you clarifying questions, if necessary.

27 THANK YOU!