Extension Program Evaluation Michigan Planning and Reporting System (MI PRS) Winter 2011 Training


1 Extension Program Evaluation Michigan Planning and Reporting System (MI PRS) Winter 2011 Training

2 Part of the Planning Process. Evaluation is an upfront activity in the design or planning phase of a program, not an after-program activity.

3 Why Outcomes? Today, in a time of continued reduction in government funding, Extension professionals are challenged more than ever before to document outcomes of programs and address stakeholder demands for accountability.

4 Review of Bennett's Hierarchy. As one moves up the hierarchy, the evidence of program impact gets stronger:
1. Resources (inputs)
2. Activities
3. People Involvement (participation)
4. Reactions
5. KASA (knowledge, attitudes, skills, aspirations)
6. Practice Change
7. End Results

5 Collecting impact data on programs is costly and time-consuming, and it requires skill. (But it is not impossible!) Extension professionals are expected to evaluate a minimum of one program a year at the impact level.

6 Example A pre/post measure can assess short-term outcomes on knowledge, attitudes, skills, and aspirations (motivation to change). A plan for participant follow-up is required to assess behavior or practice change.
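A minimal sketch of how such a pre/post comparison might be summarized, assuming a numeric questionnaire score and Python with SciPy available; the participant scores and variable names below are hypothetical, not MI PRS data.

```python
# Illustrative sketch only (not part of the original training): summarizing a
# hypothetical pre/post knowledge questionnaire for the same participants.
from scipy import stats

# Pre- and post-program scores for the same ten participants (made-up data).
pre = [4, 5, 3, 6, 5, 4, 7, 5, 6, 4]
post = [6, 7, 5, 7, 6, 6, 8, 7, 7, 6]

# Average pre-to-post change.
mean_change = sum(b - a for a, b in zip(pre, post)) / len(pre)

# Paired t-test: does the average change differ from zero?
t_stat, p_value = stats.ttest_rel(post, pre)

print(f"Mean change: {mean_change:.2f}  t = {t_stat:.2f}  p = {p_value:.4f}")
```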

7 Plan Early. Plan early for the cost, time, skills (data collection, analysis, interpretation), and resources needed to evaluate an Extension program. Work with Institute teams/groups.

8 Evaluating programs at the lower levels (inputs, participation, collaboration, activities, and reactions) may require little effort and is less expensive. This is process evaluation.

9 Process Evaluation Process evaluation, also called formative evaluation, helps program staff assess ongoing programs for improvement and implementation. Examples: program fidelity and reaching target audiences.

10 Outcome Evaluation Documenting impact or community-level outcomes requires skills in questionnaire development, data collection and analysis, interpretation, and reporting.

11 Summative evaluation, also called impact or outcome evaluation, may require an understanding of evaluation designs, data collection at multiple points, and sophisticated statistical analyses such as Analysis of Covariance and the use of covariates.
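As a rough illustration of the kind of analysis the slide mentions, the sketch below expresses an Analysis of Covariance as a linear model in Python using pandas and statsmodels, with the pre-program score as the covariate. The groups, scores, and column names are hypothetical, not taken from any MSU Extension program.

```python
# Hypothetical ANCOVA sketch (not from the original slides): compare
# post-program scores between a program group and a comparison group while
# adjusting for the pre-program score as a covariate.
import pandas as pd
import statsmodels.formula.api as smf

data = pd.DataFrame({
    "group": ["program"] * 6 + ["comparison"] * 6,
    "pre":   [4, 5, 6, 5, 4, 6, 5, 4, 6, 5, 4, 6],
    "post":  [7, 8, 8, 7, 6, 8, 5, 5, 6, 5, 4, 6],
})

# ANCOVA expressed as an ordinary least squares model: post ~ group + pre.
model = smf.ols("post ~ C(group) + pre", data=data).fit()
print(model.summary())
```

The coefficient on the group term estimates the adjusted difference between the program and comparison groups after accounting for where participants started.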

12 A Framework for Linking Costs and Program Outcomes Using Bennett's Hierarchy. Inputs, Activities, Participation, and Reactions fall under process (formative) evaluation; KASA, Practice/Behavior Change, and SEEC fall under outcome evaluation.

| Cost & Outcomes | Inputs | Activities | Participation | Reactions | KASA | Practice/Behavior Change | SEEC |
|-----------------|--------|------------|---------------|-----------|------|--------------------------|------|
| Short Term      | X      | X          | X             | X         | XX   | XXX                      | ---- |
| Intermediate    | X      | X          | X             | ----      | XX   | XXX                      | XXXX |
| Long Term       | X      | X          | X             | ----      | XX   | XXX                      | XXXX |

X = low cost, effort, and evidence; XX = requires questionnaire development, data collection, and analysis skills; XXX = requires understanding of evaluation designs, multiple data collections, additional analysis skills, and interpretation; XXXX = all of the above plus more time and increased costs, potentially resulting in stronger evidence of program impact.

13 Professional Development. Plans for professional development are captured in MI PRS; consider building skills in evaluation. Work with Institute work teams to develop program evaluation plans that fit with logic models.

14 To make an Evaluation Plan: 1. Decide if the program is ready for formative/process or summative/outcome evaluation. 2. Link program objectives to evaluation questions that address community outcomes.

15 To make an Evaluation Plan, Cont. 3. Identify key indicators for evaluation (make sure they are measurable and relevant). 4. Consider evaluation costs (follow-up techniques and comparison groups used in summative designs are more expensive). 5. Develop a cost matrix (see the sketch below).
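One possible starting point for step 5 is a simple table that pairs each hierarchy level to be measured with its data-collection method and estimated cost. The sketch below uses pandas; the hierarchy levels come from the slides above, but the methods, dollar figures, and column names are illustrative placeholders, not MSU Extension figures.

```python
# Hypothetical cost-matrix sketch; every method and dollar figure below is a
# placeholder to be replaced with a work team's real estimates.
import pandas as pd

cost_matrix = pd.DataFrame({
    "hierarchy_level":   ["Reactions", "KASA", "Practice change", "End results (SEEC)"],
    "data_collection":   ["End-of-session survey", "Pre/post questionnaire",
                          "Six-month participant follow-up", "Community-level indicators"],
    "estimated_cost":    [50, 300, 1200, 5000],   # dollars, illustrative only
    "evidence_strength": ["Lower", "Moderate", "Strong", "Strongest"],
})

print(cost_matrix.to_string(index=False))
```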

16 Tracking program and project processes and outputs, as well as outcomes, will require data collection and analysis systems outside of MI PRS. Link program costs and the cost of evaluation to the outcomes.

17 Conclusion In the end, evaluation questions that address the “so what” issue are connected to outcomes and costs, and they ultimately justify the value of Extension programs to the public good.

18 Key Reference Radhakrishna, R., & Bowne, C. (2010). Viewing Bennett’s hierarchy from a different lens: Implications for Extension program evaluation. Journal of Extension, 48(6). Retrieved January 24, 2011, from http://www.joe.org/joe/2010december/tt1.php

19 MSUE Resources Organizational Development webpage – Planning, Evaluation, and Reporting section

20 Evaluation Resources will Grow!

21 Other Extension materials on evaluation, with future MSU-specific resources to be released in 2011.

22 MSU Evaluation Specialist: assists work teams with developing logic model objectives and evaluation strategies; consults on evaluation designs; provides guidance on data analysis and selecting measures; develops and delivers educational programs related to Extension program evaluation; facilitates evaluation plan development or brainstorming for Institute work teams.

23 Organizational Development team member: Dr. Cheryl Peters, Evaluation Specialist (statewide coverage). cpeters@anr.msu.edu; 989-734-2168 (Presque Isle); fax: 989-734-4116. Campus office: Room 11, Agriculture Hall; campus phone: 517-432-7605; Skype: cpeters.msue

24 MI PRS Resources

