How to Develop a Project Evaluation Plan
Pat Gonzalez, Office of Special Education Programs, 202-245-7355

Initial Steps in Developing the Outcome Evaluation Plan
- Identify the program or project's mission and/or goals and objectives
- Identify all relevant and important outcomes that should be evaluated
- Select outcome indicators
- Identify data sources and data collection procedures

Step 1: Identify mission and/or goals and objectives
- Clarify the expectations and priorities of key stakeholders/collaborators
- Get a reasonable level of agreement on goals, strategies or activities, and outcomes
- Develop goals: broad statements generally describing desired outcomes
- Develop objectives: measurable statements about outcomes (target performance) expected to be accomplished in a given time frame

More on Objectives…
Objectives require detail: each must include a target group (who), what is to be done (activities), a time frame (when), and a target performance (how much). For example:
80% (how much) of the 300 participating teachers (who) will indicate that the Transition Toolkit is useful, relevant, and of high quality (what) on the second-year follow-up survey (when).
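An objective written this way can be checked directly against the collected data. Below is a minimal Python sketch of that check; it is illustrative only, and the survey field names (toolkit_useful, toolkit_relevant, toolkit_quality) are hypothetical, not part of the original plan.

```python
# Hypothetical follow-up survey records; field names are invented for illustration.
responses = [
    {"teacher_id": 1, "toolkit_useful": True, "toolkit_relevant": True, "toolkit_quality": True},
    {"teacher_id": 2, "toolkit_useful": True, "toolkit_relevant": False, "toolkit_quality": True},
    # ... one record per participating teacher who responded
]

TARGET = 0.80        # "how much": 80% of participants
PARTICIPANTS = 300   # "who": the 300 participating teachers

# A teacher counts toward the objective only if all three ratings are positive.
met = sum(
    1 for r in responses
    if r["toolkit_useful"] and r["toolkit_relevant"] and r["toolkit_quality"]
)

percent = met / PARTICIPANTS
print(f"{percent:.0%} of participants met the criterion (target: {TARGET:.0%})")
print("Objective met" if percent >= TARGET else "Objective not met")
```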

Step 2: Identify relevant and important project outcomes
- Short-term outcomes typically involve learning: awareness, knowledge, attitudes, skills
- Intermediate outcomes typically involve action: behavior, practice, policies
- Long-term outcomes typically involve conditions: social, economic, civic, environmental

REMEMBER! Focus on short- and intermediate-term outcomes that can be achieved within the grant period.

Outcomes are not Outputs!
Outputs are the direct products of program activities, usually measured by "volume," such as the number of classes taught or the number of participants served.

Sources of information on Program Outcomes
- Legislation and regulations
- Purpose statements contained in the RFP
- Strategic plans, SPPs, or APRs
- State data systems
- Program descriptions or annual reports
- Discussions with stakeholders or collaborators
- Complaint information
- Performance measures from government agencies or other programs

Step 3: Select outcome indicators
Each identified outcome needs to be translated into one or more outcome indicators that state specifically what is to be measured (e.g., the number of teachers passing a test).

Checklist for Outcome Indicators
- Does each indicator measure some important aspect of the outcome?
- Does each indicator start with a numerical designation, such as incidence, percentage, rate, or proportion?
- Does your list of indicators cover all your outcomes?

Checklist for Outcome Indicators (continued)
- Does your list of indicators cover "quality" characteristics, such as timeliness of services?
- Is the wording of each indicator sufficiently specific? Avoid terms like "appropriate."
- What are the feasibility and cost of collecting each indicator? Note that sometimes costly indicators are the most important and should be retained.
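One way to operationalize parts of this checklist is to record each indicator as structured data and flag entries that fail the automatable checks. The sketch below is an assumption-laden illustration, not part of the original toolkit; the field names and vague-term list are invented.

```python
from dataclasses import dataclass

# Terms the checklist flags as too vague to measure (illustrative list).
VAGUE_TERMS = {"appropriate", "adequate", "suitable"}
NUMERICAL_DESIGNATIONS = ("number", "percent", "percentage", "rate", "proportion", "incidence")

@dataclass
class OutcomeIndicator:
    outcome: str          # the outcome this indicator measures
    wording: str          # e.g., "Percentage of teachers passing the content test"
    data_source: str      # e.g., "end-of-training assessment"
    collection_cost: str  # rough cost category: "low", "medium", "high"

    def checklist_issues(self) -> list[str]:
        """Return the checklist items this indicator fails."""
        issues = []
        text = self.wording.lower()
        if not text.startswith(NUMERICAL_DESIGNATIONS):
            issues.append("does not start with a numerical designation")
        if any(term in text for term in VAGUE_TERMS):
            issues.append("contains vague wording")
        return issues

indicator = OutcomeIndicator(
    outcome="Teachers gain transition-planning knowledge",
    wording="Percentage of participating teachers scoring 80%+ on the post-test",
    data_source="post-training assessment",
    collection_cost="low",
)
print(indicator.checklist_issues() or "Passes the automated checks")
```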

Step 4: Identify data sources and data collection procedures
- Determine whether a research design can be used to evaluate effectiveness. Several quasi-experimental designs can be readily applied to program evaluation.
- Identify data sources, such as extant agency/program records, performance assessments, surveys, observer ratings, and interview data.
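As a hedged illustration of one such design, the sketch below analyzes a one-group pretest/posttest comparison, among the simplest quasi-experimental designs; the scores are invented for the example and the analysis is deliberately bare-bones.

```python
# One-group pretest-posttest design: the same participants are measured
# before and after the intervention, and the mean gain is examined.
pre  = [62, 55, 70, 48, 66, 59, 73, 51]   # invented pretest scores
post = [74, 63, 78, 60, 71, 68, 80, 57]   # invented posttest scores

gains = [b - a for a, b in zip(pre, post)]
mean_gain = sum(gains) / len(gains)
print(f"Mean gain: {mean_gain:.1f} points across {len(gains)} participants")

# In practice a paired t-test (e.g., scipy.stats.ttest_rel) would be used to
# judge whether the gain exceeds chance; omitted here to keep the sketch
# dependency-free.
```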

Considerations in Determining Data Collection Procedures
When will data be collected? Consider your design:
- When entering the program
- When completing the program
- At a fixed interval after entering
- At a fixed interval after completing
- A combination of the above

Considerations in Determining Data Collection Procedures (continued)
- Who is considered a participant?
- Will you include all participants or a sample?
- Who will collect the data?
- How will the data be analyzed?
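These timing and responsibility decisions can be captured in a simple collection schedule. The sketch below is illustrative only; the instruments, dates, and role names are assumptions, not prescribed by the plan.

```python
from datetime import date

# Hypothetical data collection schedule; every entry is invented for illustration.
schedule = [
    {"instrument": "knowledge pre-test",  "when": "program entry",
     "who": "all participants", "collector": "project staff",      "due": date(2024, 9, 1)},
    {"instrument": "knowledge post-test", "when": "program completion",
     "who": "all participants", "collector": "project staff",      "due": date(2025, 5, 30)},
    {"instrument": "follow-up survey",    "when": "6 months after completion",
     "who": "random sample",    "collector": "external evaluator", "due": date(2025, 11, 30)},
]

# Print the schedule in due-date order for monitoring purposes.
for item in sorted(schedule, key=lambda s: s["due"]):
    print(f'{item["due"]}: {item["instrument"]} ({item["when"]}) '
          f'for {item["who"]}, collected by {item["collector"]}')
```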

The Evaluation Plan: Implementation Questions
Using the information concerning goals, objectives, strategies/activities, and outcomes, develop evaluation questions on implementation: Were the activities completed as intended and on time, and did they result in the planned outputs? These questions provide a feedback loop for ongoing project monitoring.

The Evaluation Plan: Outcome Questions
Using the same information, develop evaluation questions on impact/effectiveness: How well did the activities address the objectives, as measured by the indicators? What changed for the target group, either over time or in comparison to another group?

The Evaluation Plan: Methods
For each evaluation question:
- Is a research design feasible? If so, which ones?
- What are the data sources?
- What methods will be used to collect the data?
- How might the data be analyzed and reported?

The Evaluation Plan: Timelines
For each evaluation question:
- When will data be collected?
- When will data be reported or used?

The Evaluation Plan: Personnel Responsible
Who is responsible for data collection, analysis, and reporting at each point in the timeline?
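Taken together, the questions, methods, timelines, and personnel answers form an evaluation plan matrix with one row per evaluation question. A minimal sketch follows, assuming hypothetical question text, sources, and staff assignments:

```python
# Evaluation plan matrix: one row per evaluation question.
# All question text, sources, and assignments are hypothetical examples.
plan = [
    {
        "question": "Were trainings delivered as intended and on time?",
        "type": "implementation",
        "design": "none (monitoring)",
        "data_sources": ["attendance logs", "training calendar"],
        "methods": ["document review"],
        "collect_by": "quarterly",
        "report_by": "annual performance report",
        "responsible": "project coordinator",
    },
    {
        "question": "Did participants' classroom practice change?",
        "type": "outcome",
        "design": "pre/post with comparison group",
        "data_sources": ["observer ratings"],
        "methods": ["structured observation"],
        "collect_by": "fall and spring each year",
        "report_by": "evaluation report, year 2",
        "responsible": "external evaluator",
    },
]

for row in plan:
    print(f'[{row["type"]}] {row["question"]}')
    print(f'  design: {row["design"]}; sources: {", ".join(row["data_sources"])}')
    print(f'  collect: {row["collect_by"]}; report: {row["report_by"]}; '
          f'owner: {row["responsible"]}')
```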

SPDG Evaluation Criteria
- The extent to which the methods of evaluation are thorough, feasible, and appropriate to the goals, objectives, and outcomes of the proposed project.
- The extent to which the methods of evaluation provide for examining the effectiveness of project implementation strategies.

SPDG Evaluation Criteria (continued)
- The extent to which the methods of evaluation include the use of objective performance measures that are clearly related to the intended outcomes of the project and will produce quantitative and qualitative data to the extent possible.

SPDG Evaluation Criteria (continued)
- The extent to which the methods of evaluation will provide performance feedback and permit assessment of progress toward achieving intended outcomes.

Performance Measures
- The percent of personnel receiving professional development through the SPDG Program based on scientific- or evidence-based instructional practices.
- The percent of SPDG projects that have implemented personnel development/training activities that are aligned with improvement strategies identified in their State Performance Plan.

Performance Measures (continued)
- The percent of professional development/training activities provided through the SPDG Program based on scientific- or evidence-based instructional/behavioral practices.
- The percent of professional development/training activities based on scientific- or evidence-based instructional/behavioral practices, provided through the SPDG Program, that are sustained through ongoing and comprehensive practices (e.g., mentoring, coaching, structured guidance, modeling, continuous inquiry).

Performance Measures (continued)
- In states with SPDG projects that have special education teacher retention as a goal, the statewide percent of highly qualified special education teachers in state-identified professional disciplines (e.g., teachers of children with emotional disturbance, deafness, etc.), consistent with sections 602(a)(10) and 612(a)(14) of IDEA, who remain teaching after the first three years of employment.

Performance Measures (continued)
- The percent of SPDG projects that successfully replicate the use of scientific- or evidence-based instructional/behavioral practices in schools.
- The percent of SPDG projects whose cost per personnel receiving professional development on scientific- or evidence-based practices is within a specified range.
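Since each of these measures is a percentage over a defined denominator, reporting reduces to careful counting. A minimal sketch follows for the professional development measures above; the activity records and the evidence_based/sustained flags are hypothetical, not the official reporting format.

```python
# Hypothetical PD activity records for one SPDG project; the fields are
# assumptions made for illustration.
activities = [
    {"name": "PBIS coaching series",   "evidence_based": True,  "sustained": True},
    {"name": "One-day RTI overview",   "evidence_based": True,  "sustained": False},
    {"name": "Awareness presentation", "evidence_based": False, "sustained": False},
]

# Measure: percent of PD activities based on scientific- or evidence-based practices.
evidence_based = [a for a in activities if a["evidence_based"]]
pct_evidence_based = len(evidence_based) / len(activities)

# Measure: of those, the percent sustained through ongoing, comprehensive
# practices (mentoring, coaching, etc.).
sustained = [a for a in evidence_based if a["sustained"]]
pct_sustained = len(sustained) / len(evidence_based)

print(f"Evidence-based PD activities: {pct_evidence_based:.0%}")
print(f"...of which sustained: {pct_sustained:.0%}")
```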