
1 Measuring Value: Using Program Evaluation to Understand What’s Working -- Or Isn’t
Juliana M. Blome, Ph.D., MPH
Office of Program Analysis and Evaluation, National Institute of General Medical Sciences
MORE Program Directors Meeting, Colorado Springs, Colorado, June 12, 2009

2 “Program Evaluation…!?”

3 Program Evaluation: What is it?
Program evaluations are individual, systematic studies that use objective measurement and analysis to answer specific questions about how well a program is working. - GAO/GGD Program Evaluation

4 Evaluation Answers Questions Such As….
Does it work? How well does it work? Does it do what we want it to? Does it work for the reasons we think it does? Is it cost effective? Are the benefits worth it? What are the unintended consequences?

5 Research vs. Program Evaluation
Program evaluation: judges merit or worth; the policy and program interests of stakeholders are paramount; provides information for decision-making on a specific program; conducted within a setting of changing actors, priorities, resources, and timelines.
Research: produces generalizable knowledge; scientific inquiry based on intellectual curiosity; advances broad knowledge and theory; conducted in a controlled setting.
"Research seeks to prove; evaluation seeks to improve." - Michael Quinn Patton

6 Why bother?
To gain insight about a program and its operations
To improve practice - to modify or adapt practices to enhance the likelihood of success
To assess effects - to determine if we're meeting our goals and provide evidence of effectiveness

7 Guidelines for Conducting Successful Evaluations
Invest heavily in planning early on Integrate evaluation into ongoing program activities Use knowledgeable, experienced evaluators

8 Evaluator Skills
Evaluation theory and methods
Research methods (design, planning, statistics, qualitative and quantitative methods)
Data collection, analysis, and interpretation
Communication and interpersonal skills
Content-area skills
Project management
Ethics
At universities and colleges, this type of expertise is found in the social and behavioral sciences departments!

9 Evaluation Costs
The National Science Foundation's "rule of thumb" for evaluation budgets is 10% of the total grant amount.
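As a worked illustration (the award figure is hypothetical, not from the presentation): under that rule of thumb, a $500,000 grant would budget roughly $500,000 × 0.10 = $50,000 for evaluation activities over the life of the award.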

10 Types of Evaluations
Needs Assessment: What is the nature and extent of the issues the program should address? (Planning phase)
Feasibility Study: Is an evaluation appropriate and/or affordable? Is the program mature enough, and is the timing right? Should a process or an outcome evaluation be produced?
Process Evaluation: Is the program being conducted and producing output as planned? How can the process be improved?
Outcome Evaluation: To what extent have the program's goals been met?
Notes: A needs assessment usually looks at the needs of stakeholders, at developing appropriate program goals, and at how to design or modify a program to achieve those goals; it is typically a tool for strategic planning and priority setting. A feasibility study asks: What is the best way to evaluate the program? Is this the right time to conduct an evaluation? Can it be conducted at a reasonable cost? Is the program mature enough that it is reasonable to expect outcomes at this time? It determines which evaluation design and data collection strategies can and should be used. A process evaluation looks at program operations to determine whether the program is being conducted as planned, whether output is being produced, and how processes can be improved, often against a comparison group or a recognized standard of operations. An outcome evaluation examines program accomplishments and effects to determine whether the program is meeting its intermediate and long-term goals, often comparing current performance against prior program performance, a comparable control group, or recognized standards of performance.

11 CDC Framework for Program Evaluation
[Figure: the framework's six steps shown as a cycle - Engage stakeholders; Describe the program; Focus the evaluation design; Gather credible evidence; Justify conclusions; Use and share lessons learned - arranged around the four standards: Utility, Feasibility, Propriety, Accuracy.]
For more info: Milstein et al., Health Promotion Practice, July 2000, Vol. 1(3).

12 Evaluation Standards
Utility: Evaluations should serve the practical information needs of a given audience.
Feasibility: Evaluations take place in the field and should be realistic, prudent, diplomatic, and frugal.
Propriety: The rights of individuals affected by evaluations should be protected.
Accuracy: Evaluations should produce and convey accurate information about a program's merit and/or worth.
Notes: Utility - Who needs the information, and what information do they need? Will the evaluation provide relevant, useful information in a timely manner? Feasibility - How much money, time, and effort can we put into this? Is the planned evaluation realistic given the time, resources, and expertise available? Propriety - What steps need to be taken for the evaluation to be ethical and legal? Does it protect the rights and welfare of the individuals involved? Does it engage those affected by the program and the evaluation? Accuracy - What design will lead to accurate information? Will it produce valid and reliable findings? See also: Guiding Principles for Evaluators, American Evaluation Association.

13 CDC Framework: Key Steps in Evaluation
Engage stakeholders
Describe the program
Focus the evaluation design
Gather credible evidence
Justify conclusions
Ensure use and share lessons learned

14 Step 1- Engage Stakeholders
Who are the stakeholders? Those involved in program operations, those affected by the program, and the users of evaluation results.

15 Step 2 - Describe the Program
What are the goals and specific aims of the program? What problem or need is it designed to address? What are the measurable objectives? What are the strategies to achieve the objectives? What are the expected effects? What are the resources and activities? How is the program supposed to work?
Notes: Before you can talk about evaluating a program, you have to agree on what the program actually is. Describe it in enough detail to give a solid understanding of its mission, objectives, and strategies; uncover differences of opinion; set a frame of reference for later decisions; and agree on goals and milestones. Consider the need for the program, its expected effects, activities, resources, stage of development, and context.

16 “I think you should be more explicit here in Step Two.”
Cartoon by Sidney Harris, copyright The New Yorker.

17 Step 3 - Focus the evaluation design
What do you want to know? Consider the purpose, uses, questions, methods, roles, budgets, deliverables, etc. An evaluation cannot answer all questions for all stakeholders. Consider political viability, resources, practical procedures, etc.

18 Step 4 - Gather credible evidence
Evidence must be believable, trustworthy, and relevant.
Consider information scope, sources, quality, and logistics.
Methodology and data collection: who is studied, and when?
What counts as evidence? Evidence must be seen as trustworthy and relevant, which depends on the questions asked and on stakeholders' views; it should be defensible and reliable.
Systematic information: sampling design, use of comparison groups, timing and frequency of data collection, and issues of bias (sample and respondent). (A sketch of a simple comparison-group analysis follows below.)
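To make "use of comparison groups" concrete, here is a minimal sketch (not from the presentation) of one common analysis: comparing mean outcomes for program participants against a comparison group with a two-sample t-test. The outcome scores are hypothetical and the SciPy library is assumed; a real evaluation would also need a defensible sampling design and attention to the bias issues listed above.

# Minimal, hypothetical sketch: compare outcomes for program
# participants vs. a comparison group (illustrative data only).
from scipy import stats

# Hypothetical post-program outcome scores
program_group = [78, 85, 90, 72, 88, 81, 95, 76]
comparison_group = [70, 75, 82, 68, 79, 74, 77, 71]

# Two-sample t-test: is the group difference larger than chance alone would explain?
t_stat, p_value = stats.ttest_ind(program_group, comparison_group)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

A small p-value here is only one piece of credible evidence; whether it is defensible still depends on how the comparison group was chosen and when the data were collected.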

19 Step 5 - “Justify” Conclusions
Consider the data:
Analysis and synthesis - determine the findings
Interpretation - what do the findings mean?
Judgments - what is the value of the findings, based on accepted standards?
Recommendations - what claims can be made? What are the limitations of your design?
Conclusions must be linked to the evidence and consistent with agreed-upon values or standards.

20 Step 6 - Use and share results
Share lessons learned with stakeholders! Provide feedback, offer briefings, and disseminate findings.
What steps will you take to disseminate findings? Provide feedback to stakeholders, schedule follow-up meetings with users, and plan, prepare, and follow through.

21 Next Session – Moving from the abstract to the concrete
Are you overwhelmed?

