Using Individual Project and Program Evaluations to Improve the Part D Programs
Dr. Herbert M. Baum

Session Theme #3: Part D programs have their own priorities for evidence, which project directors need to support.

Objectives
By the end of this presentation you will:
o Know the difference between program measurement and program management
o Understand how to use logic models for developing program measures
o See examples of how current program information is being used to enhance the Part D Programs

The trickle up approach (1)
o Each OSEP/RTP Program has a logic model.
o Each OSEP/RTP Program has an approved series of performance measures.
o Each OSEP/RTP Program funds projects.

The trickle up approach (2)
Number and Type of Measures by OSEP/RTP Program

Program                  Annual   Long-term   Efficiency
Personnel Development    5        2           1
PTIC                     3        2           1
TA&D                     3        2           1
Tech and Media           3        2           1

The trickle up approach (3)
How does each OSEP/RTP Program know that it is meeting its targets?
o Your projects need to provide that data.
How does each OSEP/RTP Program use the data to manage?
o Keeping project officers informed of where the Program is relative to the target.
o Using the measures to direct new grant applications to address how they will help OSEP/RTP meet its targets.
o By meeting targets, OSEP/RTP is in a stronger position to ask for additional funding.

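A minimal sketch of the trickle-up idea described above: project-level annual report data roll up into a program-level measure that is compared against the program target. This is illustrative only; the ProjectReport class, the example projects, and the 80% target are assumptions, not actual OSEP reporting code or data.

```python
from dataclasses import dataclass

@dataclass
class ProjectReport:
    """Hypothetical project-level data item reported up to the program."""
    name: str
    incorporates_ebp: bool  # e.g., does the project incorporate evidence-based practices?

def program_measure(reports: list[ProjectReport]) -> float:
    """Percentage of reporting projects that incorporate evidence-based practices."""
    if not reports:
        return 0.0
    hits = sum(1 for r in reports if r.incorporates_ebp)
    return 100.0 * hits / len(reports)

# Illustrative project reports and program target
reports = [
    ProjectReport("Project A", True),
    ProjectReport("Project B", True),
    ProjectReport("Project C", False),
]
target = 80.0  # hypothetical program target

actual = program_measure(reports)
print(f"Program measure: {actual:.1f}% (target {target:.1f}%) -> "
      f"{'met' if actual >= target else 'not met'}")
```
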
Performance Measurement

Performance Management

Evaluation in Performance Management

Performance Management Steps
1. Assess – review program purpose and design
2. Plan – set strategic performance targets
3. Measure – measure program performance
4. Analyze – evaluate program results
5. Improve – implement program enhancements
6. Sustain – manage program effectiveness

Evaluation in the Context of PART
The Program Assessment Rating Tool (PART) was developed to assess the effectiveness of federal programs and to help inform management actions, budget requests, and legislative proposals directed at achieving results.
The PART assesses if and how program evaluation is used to inform program planning and to corroborate program results.
The four sections of a PART review are:
o Program purpose and design (20%)
o Strategic planning (10%)
o Program management (20%)
o Program results (50%)

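As a worked example of the section weights listed above, the sketch below combines hypothetical section scores into a single weighted score. It only illustrates the arithmetic of the 20/10/20/50 weighting; the section scores are made up, and it does not reproduce the actual PART scoring rubric or rating bands.

```python
# Section weights taken from the slide above
SECTION_WEIGHTS = {
    "Program purpose and design": 0.20,
    "Strategic planning": 0.10,
    "Program management": 0.20,
    "Program results": 0.50,
}

def weighted_part_score(section_scores: dict[str, float]) -> float:
    """Combine section scores (0-100) into a single weighted score."""
    return sum(SECTION_WEIGHTS[name] * score for name, score in section_scores.items())

example_scores = {  # hypothetical section scores, not a real assessment
    "Program purpose and design": 90,
    "Strategic planning": 75,
    "Program management": 80,
    "Program results": 60,
}

# 0.20*90 + 0.10*75 + 0.20*80 + 0.50*60 = 71.5
print(f"Overall weighted score: {weighted_part_score(example_scores):.1f}")
```
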
Evaluation and PART (2)
Question 2.6 asks: "…[a]re independent evaluations of sufficient scope and quality conducted on a regular basis, or as needed to support program improvements, and evaluate effectiveness and relevance to the problem, interest, or need?"
Question 4.5 asks if "independent evaluations of sufficient scope and quality indicate that the program is effective and achieving results."

What is a logic model?
"The underlying rationale for the evaluand's design, usually an explanation of why the various components of the program (for example) have been created and what, and how, each of them is supposed to contribute towards achieving the desired outcomes. Some logic models include environmental factors, some do not. Note that we are talking about the alleged 'theory of operation,' and the evaluation may discover considerable discrepancies between this (the view of the designers and possibly also the managers of the program) and the views of the service deliverers, who are the hands-on staff engaged in dealing with the recipients of the service or product."

Simplified Logic Model
Inputs – What the program needs to accomplish its outcomes.
Activities – What programs do to accomplish their outcomes.
Outputs – What programs produce to accomplish their outcomes.
Outcomes – What changes the program expects based on its inputs, activities, and outputs [short-term, intermediate, and long-term (impact)].

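One way to make the four components concrete is to represent a logic model as a small data structure, so each element can later be tied to a performance measure. This is an illustrative sketch only; the fields and example entries are assumptions, not an OSEP specification.

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """Minimal container for the four components of a simplified logic model."""
    inputs: list[str] = field(default_factory=list)      # what the program needs
    activities: list[str] = field(default_factory=list)  # what the program does
    outputs: list[str] = field(default_factory=list)     # what the program produces
    outcomes: dict[str, list[str]] = field(default_factory=dict)  # keyed by time frame

# Hypothetical entries, loosely based on the Personnel Development Program slide
pdp = LogicModel(
    inputs=["Funding", "Project officers", "Evidence-based & best practices"],
    activities=["Train personnel", "Monitor grants", "Develop and disseminate resources"],
    outputs=["Scholars trained", "Resources disseminated"],
    outcomes={
        "short-term": ["Increased supply of fully qualified personnel"],
        "long-term": ["Increased retention of fully qualified personnel"],
    },
)
print(pdp.outcomes["long-term"])
```
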
OSEP Personnel Development Program – Logic Model
A blueprint to enhance understanding of the Program.
Goal: To improve results for children with disabilities and their families.
[Logic model diagram. Inputs (program investments): project officers, funding, evidence-based & best practices, research, program & grants policy, technology, and time, in the context of federal law & regulations. Outputs: activities (develop priorities & manage competitions, monitor grants, train personnel, redesign & build models & networks for collaboration, develop and disseminate resources) and participation (grantees, faculty, students in IHEs, SEAs & LEAs, lead agencies, practitioners, administrators, children, families). Outcomes (short-term, intermediate, and long-term, tracked with process and outcome measures): increased supply of fully qualified personnel* with awareness and knowledge of EBP & best practices; increased collaboration among SEAs, IHEs, LEAs & lead agencies; increased training opportunities; increased placement of fully qualified* personnel; improved personnel development infrastructures; and increased retention of fully qualified* personnel in the workforce (schools & programs, educational & lead agencies, and IHEs).]
*Fully Qualified = Highly Qualified for a special education teacher; Qualified for a paraprofessional/aide; Fully Certified for an administrator/coordinator, for related or supportive services in a school setting, or for a teacher, related services, or supportive services in early intervention or early childhood.

Measure 1.1 (Annual): The percentage of Special Education Personnel Preparation projects that incorporate evidence-based practices in the curriculum.
Measure 1.2 (Long-Term): The percentage of scholars completing Special Education Personnel Preparation funded training programs who are knowledgeable and skilled in evidence-based practices for infants, toddlers, children, and youth with disabilities.

Measure 2.1 (Annual): The percentage of Special Education Personnel Preparation funded scholars who exit training programs prior to completion due to poor academic performance.

Measure 2.2 (Long-Term): The percentage of low incidence positions that are filled by personnel who are fully qualified under IDEA.
Measure 2.4 (Annual): The percentage of Special Education Personnel Preparation funded degree/certification recipients who are working in the area(s) for which they were trained upon program completion and who are fully qualified under IDEA.

Measure 2.5 (Long-Term): The percentage of degree/certification recipients who maintain employment for 3 or more years in the area(s) for which they were trained and who are fully qualified under IDEA.

Example from the Personnel Development Program
Measure 1.1 of 2 (Annual): The percentage of Special Education Personnel Preparation projects that incorporate evidence-based practices in the curriculum.

Example from the TA&D Program
Measure 1.1 of 1 (Long-term): The percentage of school districts and service agencies receiving Special Education Technical Assistance and Dissemination services regarding scientifically- or evidence-based practices for infants, toddlers, children and youth with disabilities that implement those practices.

Questions?
Contact information: Herbert M. Baum, Ph.D., ICF Macro