Earmark Grant Evaluation: An Introduction and Overview
May 19, 2005
Presented by: Jeff Padden, President
Public Policy Associates, Inc.
119 Pere Marquette Drive, Lansing, Michigan (517)

Presentation Topics
- The evaluation requirement for earmark grants
- Evaluation overview – or – "Where's the upside?"
- Planning the evaluation
- The evaluation process for earmark grants
- Discussion

We'll take clarifying questions on the fly and hold broader discussion for the end.

The Evaluation Requirement

Each grantee must…
- Conduct or commission an evaluation
- Submit an evaluation plan
- Use the evaluation template
- Submit an evaluation report shortly after completion of project activities

Evaluation Overview – or – “Where’s the upside?”

Program evaluation is…
- The systematic collection of information about the subject of the evaluation
- Used to make decisions about an organization's or program's:
  - Creation
  - Improvement
  - Effectiveness

Evaluation is a mindset…
- We are all evaluators
- Evaluation is continuous
- Evaluation looks forward, not just backward
- Involves organizational learning
- Means people working together

Evaluation allows you to examine…
- What's working well
- What is not
- How to improve

There is no bad news, only news!

Evaluation requires comparison…
- Of the same group over time
  - Pre- and post-tests
  - Trends in community-level data
- Of two comparable groups
  - At one point in time
  - Over time
- Of your group to a larger group
  - County compared to state
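For the first comparison type, a minimal sketch of what a same-group-over-time comparison looks like in practice is shown below; the participant IDs and test scores are invented for illustration.

```python
# Minimal sketch of the first comparison type: the same group over time.
# Participant IDs and pre/post test scores are invented for illustration.
pre_scores = {"p1": 54, "p2": 61, "p3": 47}
post_scores = {"p1": 72, "p2": 69, "p3": 66}

# Change for each participant, then the average change across the group.
changes = {p: post_scores[p] - pre_scores[p] for p in pre_scores}
average_change = sum(changes.values()) / len(changes)

print("Change per participant:", changes)
print("Average pre/post change:", round(average_change, 1))
```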

Our Approach: Utilization-Focused Evaluation
- Focuses on intended uses and users
- Is inherently participatory and collaborative by actively involving primary intended users in all aspects of the evaluation
- Leads to ongoing, longer-term commitment to using evaluation logic and building a culture of learning in a program or organization
- Symbiotic rather than parasitic

Benefits of Evaluation
- Program/organizational improvement
- Accountability to funders and others
- Planning
- Program description for stakeholders
- Public relations
- Fund raising
- Policy decision making

Evaluation has lots of upside!

Planning the Evaluation

Elements of the Evaluation Plan
- Who conducts the evaluation?
  - Internal or external?
  - Experienced or novice?
- When do they do it?
  - Along the way or after the fact?
- How much do they do?
  - The level of intensity must fit the project
  - Too much diverts resources; too little leaves unanswered questions
- What exactly do they do?
  - Six major steps

Evaluation Steps
1. Clarify project & goals
2. Establish measures
3. Collect data
4. Analyze data
5. Prepare reports
6. Improve project

Step 1: Clarify Project & Goals
- Thinking about goals
  - What are you trying to accomplish?
  - What would success look like?
  - What is the difference between the current state of affairs and what you are trying to create?
- Example of a goal statement: "Increase incomes of low-income families in the region through training for entry-level jobs that have career ladders leading to good jobs."

Does the Project Hang Together?
- Are the expected outcomes realistic?
- Are there enough resources?
- Do the customers like the product?
- Does the organization have the right skills?

Logic models help answer these questions.

A Simple Logic Model
Inputs → Activities → Outputs → Outcomes
- Inputs: things needed to run the project (people, resources, money, etc.)
- Activities: what you do (market, recruit, design, train, place, etc.)
- Outputs: direct results of activities (training completers, credentials awarded, etc.)
- Outcomes: changes caused by the project (jobs, wages, promotions, etc.)
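One lightweight way to keep the logic model in view during planning is to write it down as structured data. The sketch below uses example entries drawn from the slide; it is one possible representation, not a required format.

```python
# A simple logic model captured as a dictionary, following the
# Inputs -> Activities -> Outputs -> Outcomes chain above.
logic_model = {
    "inputs":     ["people", "resources", "money"],
    "activities": ["market", "recruit", "design", "train", "place"],
    "outputs":    ["training completers", "credentials awarded"],
    "outcomes":   ["jobs", "wages", "promotions"],
}

# Print the chain stage by stage for a quick sanity check of the design.
for stage, items in logic_model.items():
    print(f"{stage}: {', '.join(items)}")
```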

Step 2: Establish Measures
- Determine performance measures
  - Must be quantifiable
  - Data must be available, reliable, and valid
- Examples of measures:
  - Activity: number of training sessions
  - Output: number of trainees
  - Outcome: skill and credential gains
  - Impact: stronger local workforce
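To keep measures quantifiable and tied to available data, it can help to record each measure alongside a target and a data source. The sketch below is illustrative only; the targets and source names are invented.

```python
# Illustrative performance measures; targets and data sources are invented.
measures = [
    {"level": "activity", "measure": "training sessions held", "target": 24,  "source": "program records"},
    {"level": "output",   "measure": "trainees completing",    "target": 100, "source": "enrollment database"},
    {"level": "outcome",  "measure": "credentials earned",     "target": 80,  "source": "certification records"},
]

# List each measure with its target and where the data will come from.
for m in measures:
    print(f"{m['level']}: {m['measure']} (target {m['target']}, source: {m['source']})")
```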

Step 3: Collect Data
- Identify data sources, such as:
  - Administrative records
  - Surveys, interviews, focus groups
  - Observation
- Gather data
  - Design the instruments and procedures for collection.
  - Conduct data collection periodically.
- Record data
  - Organize data.
  - Create database.
  - Verify data.

Remember the measures!
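As one way to handle the "record data" step, the sketch below assumes survey results arrive as a CSV file and uses pandas for convenience; the file name and column names are hypothetical.

```python
# Minimal sketch: organize and verify collected data before analysis.
# "trainee_survey.csv" and its column names are hypothetical.
import pandas as pd

df = pd.read_csv("trainee_survey.csv")

# Organize: keep only the fields tied to the performance measures.
df = df[["trainee_id", "completed", "wage_at_exit"]]

# Verify: flag missing values and duplicate records.
print("Missing values per column:")
print(df.isna().sum())
print("Duplicate trainee IDs:", df["trainee_id"].duplicated().sum())
```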

Step 4: Analyze and Interpret Data
- Sort and sift: organize data for interpretation
  - Cross-tabs
  - Modeling
- Conduct data analysis to look for:
  - Changes over time
  - Progress relative to goals or standards
  - Differences between groups
- Test preliminary interpretation

This is the most creative step.
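As a small example of the "sort and sift" step, the sketch below builds a cross-tab and a change-over-time summary from an invented dataset, again assuming pandas rather than any particular required tool.

```python
# Minimal sketch: a cross-tab and a change-over-time summary on invented data.
import pandas as pd

df = pd.DataFrame({
    "site":      ["A", "A", "B", "B", "A", "B"],
    "quarter":   [1, 2, 1, 2, 2, 1],
    "completed": [True, True, False, True, False, True],
})

# Cross-tab: completion counts by site (differences between groups).
print(pd.crosstab(df["site"], df["completed"]))

# Change over time: completion rate by quarter.
print(df.groupby("quarter")["completed"].mean())
```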

Step 5: Prepare Reports
- Determine reporting schedule
- Report preliminary findings to key stakeholders and other audiences
- Gather reactions
- Incorporate reactions
- Finalize reporting products

Different audiences need different types of reports.

Step 6: Improve Project
- Deliver reporting products internally.
- Facilitate strategic and operational planning.
- Improve processes and results.

A good evaluation will be more valuable to you than to DOL!

The Evaluation Process for Earmark Grants

Use the DOL Tools
- "The Essential Guide for Writing an Earmark Grant Proposal"
- "Evaluation Template for Earmark Grantees" (to be provided later)

Discussion

Thanks to … for the use of the "Demystifying Evaluation" materials.

Useful evaluation links:
- W.K. Kellogg Foundation:
- American Evaluation Association:
- Western Michigan University Evaluation Checklists:

Earmark Grant Evaluation: An Introduction and Overview
May 2005
Presented by: Jeff Padden, President
Public Policy Associates, Inc.
119 Pere Marquette Drive, Lansing, Michigan (517)