
1 Earmark Grant Evaluation: An Introduction and Overview
May 19, 2005
Presented by: Jeff Padden, President
Public Policy Associates, Inc.
119 Pere Marquette Drive, Lansing, Michigan 48912-1231
(517) 485-4477
www.publicpolicy.com

2 Presentation Topics
- The evaluation requirement for earmark grants
- Evaluation overview – or – “Where’s the upside?”
- Planning the evaluation
- The evaluation process for earmark grants
- Discussion
We’ll do clarifications on the fly, broader discussion at the end.

3 The Evaluation Requirement

4 Each grantee must …
- Conduct or commission an evaluation
- Submit an evaluation plan
- Use the evaluation template
- Submit an evaluation report shortly after completion of project activities

5 Evaluation Overview – or – “Where’s the upside?”

6 Program evaluation is …
- The systematic collection of information about the subject of the evaluation
- Used to make decisions about an organization’s or program’s:
  - Creation
  - Improvement
  - Effectiveness

7 Evaluation is a mindset …
- We are all evaluators
- Evaluation is continuous
- Evaluation looks forward, not just backward
- Involves organizational learning
- Means people working together

8 Evaluation allows you to examine …
- What’s working well
- What is not
- How to improve
There is no bad news, only news!

9 Evaluation requires comparison …
- Of the same group over time
  - Pre- and post-tests
  - Trends in community-level data
- Of two comparable groups
  - At one point in time
  - Over time
- Of your group to a larger group
  - County compared to state
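To make the comparison idea concrete, here is a minimal sketch of the pre/post and between-group logic, using invented scores for a hypothetical program group and comparison group (the numbers and variable names are illustrative assumptions, not grant data):

```python
# Minimal sketch: compare the same group over time (pre vs. post) and
# compare that change against a comparable group. All data is hypothetical.

from statistics import mean

program_pre = [52, 48, 61, 55, 50]      # hypothetical pre-test scores
program_post = [64, 59, 72, 66, 63]     # hypothetical post-test scores
comparison_pre = [53, 49, 60, 56, 51]
comparison_post = [55, 52, 63, 58, 54]

# Change within the same group over time
program_change = mean(program_post) - mean(program_pre)
comparison_change = mean(comparison_post) - mean(comparison_pre)

# Comparing two comparable groups: how much more did the program group change?
relative_change = program_change - comparison_change

print(f"Program group change:      {program_change:+.1f}")
print(f"Comparison group change:   {comparison_change:+.1f}")
print(f"Difference between groups: {relative_change:+.1f}")
```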

10 Our Approach: Utilization-Focused Evaluation
- Focuses on intended uses and users
- Is inherently participatory and collaborative by actively involving primary intended users in all aspects of the evaluation
- Leads to ongoing, longer-term commitment to using evaluation logic and building a culture of learning in a program or organization
- Symbiotic rather than parasitic

11 Benefits of Evaluation
- Program/organizational improvement
- Accountability to funders and others
- Planning
- Program description for stakeholders
- Public relations
- Fund raising
- Policy decision making
Evaluation has lots of upside!

12 Planning the Evaluation

13 Elements of the Evaluation Plan
- Who conducts the evaluation?
  - Internal or external?
  - Experienced or novice?
- When do they do it?
  - Along the way or after the fact?
- How much do they do?
  - The level of intensity must fit the project
  - Too much diverts resources, too little leaves unanswered questions
- What exactly do they do?
  - Six major steps

14 Evaluation Steps
1. Clarify project & goals
2. Establish measures
3. Collect data
4. Analyze data
5. Prepare reports
6. Improve project

15 Step 1: Clarify Project & Goals
- Thinking about goals
  - What are you trying to accomplish?
  - What would success look like?
  - What is the difference between the current state of affairs and what you are trying to create?
- Example of a goal statement: “Increase incomes of low-income families in the region through training for entry-level jobs that have career ladders leading to good jobs.”

16 Does the Project Hang Together?
- Are the expected outcomes realistic?
- Are there enough resources?
- Do the customers like the product?
- Does the organization have the right skills?
Logic models help answer these questions.

17 A Simple Logic Model
- Inputs: Things needed to run the project (people, resources, money, etc.)
- Activities: What you do (market, recruit, design, train, place, etc.)
- Outputs: Direct results of activities (training completers, credentials awarded, etc.)
- Outcomes: Changes caused by the project (jobs, wages, promotions, etc.)
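One hypothetical way to keep a logic model usable during the evaluation is to record it as plain data so each measure can be traced back to an inputs/activities/outputs/outcomes element. The structure and entries below are illustrative assumptions, not a DOL format:

```python
# Hypothetical record of a project's logic model as a simple data structure.

from dataclasses import dataclass, field

@dataclass
class LogicModel:
    inputs: list[str] = field(default_factory=list)      # things needed to run the project
    activities: list[str] = field(default_factory=list)  # what you do
    outputs: list[str] = field(default_factory=list)     # direct results of activities
    outcomes: list[str] = field(default_factory=list)    # changes caused by the project

model = LogicModel(
    inputs=["staff", "training facility", "grant funds"],
    activities=["recruit participants", "deliver training", "place graduates"],
    outputs=["training completers", "credentials awarded"],
    outcomes=["jobs obtained", "wage gains", "promotions"],
)
print(model.outputs)
```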

18 Step 2: Establish Measures
- Determine performance measures
  - Must be quantifiable
  - Data must be available, reliable, and valid
- Examples of measures:
  - Activity: Number of training sessions
  - Output: Number of trainees
  - Outcome: Skill and credential gains
  - Impact: Stronger local workforce
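As a small sketch of what “quantifiable” can look like in practice, the example measures above could be written down with explicit targets so later reporting has something to compare against. The target values here are hypothetical, and this is not the DOL evaluation template:

```python
# Hypothetical performance measures with illustrative numeric targets.

measures = [
    {"level": "Activity", "measure": "Number of training sessions",    "target": 24},
    {"level": "Output",   "measure": "Number of trainees",             "target": 120},
    {"level": "Outcome",  "measure": "Trainees gaining a credential",  "target": 90},
]

for m in measures:
    print(f'{m["level"]:8} {m["measure"]}: target {m["target"]}')
```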

19 Step 3: Collect Data
- Identify data sources, such as:
  - Administrative records
  - Surveys, interviews, focus groups
  - Observation
- Gather data
  - Design the instruments and procedures for collection.
  - Conduct data collection periodically.
- Record data
  - Organize data.
  - Create database.
  - Verify data.
Remember the measures!
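A minimal sketch of the “record data” sub-steps, assuming hypothetical assessment rows loaded into a small SQLite database and then verified with a quick completeness check (table and column names are invented for illustration):

```python
# Organize, store, and verify hypothetical assessment data.

import sqlite3

rows = [
    ("T001", 52, 64),   # (trainee_id, pre_score, post_score) - hypothetical
    ("T002", 48, 59),
    ("T003", 61, 72),
]

conn = sqlite3.connect("evaluation.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS assessments "
    "(trainee_id TEXT PRIMARY KEY, pre_score INTEGER, post_score INTEGER)"
)
conn.executemany("INSERT OR REPLACE INTO assessments VALUES (?, ?, ?)", rows)
conn.commit()

# Verify: count records and flag any missing scores
count = conn.execute("SELECT COUNT(*) FROM assessments").fetchone()[0]
missing = conn.execute(
    "SELECT COUNT(*) FROM assessments WHERE pre_score IS NULL OR post_score IS NULL"
).fetchone()[0]
print(f"{count} records loaded, {missing} with missing scores")
conn.close()
```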

20 Step 4: Analyze and Interpret Data
- Sort and sift: organize data for interpretation
  - Cross-tabs
  - Modeling
- Conduct data analysis to look for:
  - Changes over time
  - Progress relative to goals or standards
  - Differences between groups
- Test preliminary interpretation
This is the most creative step.
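For the cross-tabs idea, here is a minimal sketch comparing an outcome across two groups with pandas. The data frame, group labels, and outcome column are hypothetical:

```python
# Cross-tabulate a hypothetical outcome (credential earned) by group.

import pandas as pd

df = pd.DataFrame({
    "group": ["program", "program", "program", "comparison", "comparison", "comparison"],
    "credential": ["yes", "yes", "no", "yes", "no", "no"],
})

# Counts of each outcome within each group, with row/column totals
table = pd.crosstab(df["group"], df["credential"], margins=True)
print(table)
```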

21 Step 5: Prepare Reports
- Determine reporting schedule
- Report preliminary findings to key stakeholders and other audiences
- Gather reactions
- Incorporate reactions
- Finalize reporting products
Different audiences need different types of reports.

22 Step 6: Improve Project
- Deliver reporting products internally.
- Facilitate strategic and operational planning.
- Improve processes and results.
A good evaluation will be more valuable to you than to DOL!

23 The Evaluation Process for Earmark Grants

24 Use the DOL Tools
- “The Essential Guide for Writing an Earmark Grant Proposal”
- “Evaluation Template for Earmark Grantees” (to be provided later)

25 Discussion

26 Thanks to … … for the use of the “Demystifying Evaluation” materials.
Useful evaluation links:
- W.K. Kellogg Foundation: www.wkkf.org/Programming/Overview.aspx?CID=281
- American Evaluation Association: www.eval.org/EvaluationLinks/default.htm
- Western Michigan University Evaluation Checklists: www.wmich.edu/evalctr/checklists/checklistmenu.htm

27 Earmark Grant Evaluation: An Introduction and Overview
May 2005
Presented by: Jeff Padden, President
Public Policy Associates, Inc.
119 Pere Marquette Drive, Lansing, Michigan 48912-1231
(517) 485-4477
www.publicpolicy.com

