1 Measuring Value: Using Program Evaluation to Understand What’s Working -- Or Isn’t
Juliana M. Blome, Ph.D., MPH
Office of Program Analysis and Evaluation
National Institute of General Medical Sciences
MORE Program Directors Meeting
Colorado Springs, Colorado
June 12, 2009
3 Program Evaluation: What is it?
Program evaluations are individual, systematic studies that use objective measurement and analysis to answer specific questions about how well a program is working.
- GAO/GGD Program Evaluation
4 Evaluation Answers Questions Such As…
Does it work?
How well does it work?
Does it do what we want it to?
Does it work for the reasons we think it does?
Is it cost effective?
Are the benefits worth it?
What are the unintended consequences?
5 Research vs. Program Evaluation
Evaluation:
Judges merit or worth
Policy & program interests of stakeholders paramount
Provides information for decision-making on a specific program
Conducted within a setting of changing actors, priorities, resources, & timelines
Research:
Produces generalizable knowledge
Scientific inquiry based on intellectual curiosity
Advances broad knowledge and theory
Controlled setting
“Research seeks to prove; evaluation seeks to improve.” – Michael Quinn Patton
6 Why bother?
To gain insight about a program and its operations
To improve practice – to modify or adapt practices to enhance the likelihood of success
To assess effects – to determine if we’re meeting our goals and to provide evidence of effectiveness
7 Guidelines for Conducting Successful Evaluations
Invest heavily in planning early on
Integrate evaluation into ongoing program activities
Use knowledgeable, experienced evaluators
8 Evaluator Skills
Evaluation theory and methods
Research methods (design, planning, statistics, qualitative and quantitative methods)
Data collection, analysis, and interpretation
Communication and interpersonal skills
Content area skills
Project management
Ethics
At universities and colleges, this type of expertise is found in the social and behavioral sciences departments!
9 Evaluation Costs
The National Science Foundation’s “rule of thumb” for evaluation budgets is 10% of the total grant amount.
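The rule of thumb above is simple arithmetic; as a quick illustration, here is a minimal sketch of how it applies to a grant budget. The $500,000 award amount below is purely hypothetical, not drawn from the presentation.

```python
# Illustrative sketch of NSF's 10% "rule of thumb" for evaluation budgets.
def evaluation_budget(total_grant: float, rule_of_thumb: float = 0.10) -> float:
    """Return the suggested evaluation budget for a given total grant amount."""
    return total_grant * rule_of_thumb

# A hypothetical $500,000 award would set aside $50,000 for evaluation.
print(evaluation_budget(500_000))  # 50000.0
```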
10 Types of Evaluations
Needs Assessment
What is the nature & extent of the issues the program should address?
Conducted in the planning phase
Feasibility Study
Is evaluation appropriate and/or affordable?
Is the program mature enough? Is the timing right?
Determines whether a process or outcome evaluation is produced
Process Evaluation
Is the program being conducted & producing output as planned?
How can the process be improved?
Outcome Evaluation
To what extent have a program’s goals been met?

Needs Assessment – Usually looks at the needs of stakeholders, developing appropriate program goals, and how to design or modify a program to achieve those goals. Usually a tool for strategic planning and priority setting.
Feasibility Study – What’s the best way to evaluate the program? Is this the right time to conduct an evaluation? Can it be conducted at a reasonable cost? Is the program mature enough – is it reasonable to expect outcomes at this time? Determines which evaluation design and data collection strategies can and should be used.
Process Evaluation – Looks at program operations to determine whether the program is being conducted as planned, whether output is being produced, and how processes can be improved. Often uses a comparison group or a recognized standard of operations.
Outcome Evaluation – Examines program accomplishments and effects to determine if the program is meeting intermediate and long-term goals. Often compares current program performance against prior program performance, a comparable control group, or recognized standards of performance.
11 CDC Framework for Program Evaluation
Steps (a repeating cycle):
Engage stakeholders
Describe the program
Focus evaluation design
Gather credible evidence
Justify conclusions
Use and share lessons learned
Standards (at the center of the cycle): Utility, Feasibility, Propriety, Accuracy
For more info: Milstein et al., Health Promotion Practice, July 2000, Vol. 1(3)
12 Evaluation Standards
Utility – Evaluations should serve the practical information needs of a given audience
Feasibility – Evaluations take place in the field and should be realistic, prudent, diplomatic, and frugal
Propriety – The rights of individuals affected by evaluations should be protected
Accuracy – Evaluations should produce and convey accurate information about a program’s merit and/or worth

Utility – Who needs the information, and what information do they need? Will the evaluation provide relevant, useful information in a timely manner?
Feasibility – How much money, time, and effort can we put into this? Is the planned evaluation realistic given the time, resources, and expertise available?
Propriety – What steps need to be taken for the evaluation to be ethical and legal? Does it protect the rights and welfare of the individuals involved? Does it engage those affected by the program and the evaluation?
Accuracy – What design will lead to accurate information? Will it produce valid and reliable findings?
Guiding Principles for Evaluators, American Evaluation Association
13 CDC Framework: Key Steps in Evaluation
Engage stakeholders
Describe the program
Focus the evaluation design
Gather credible evidence
Justify conclusions
Ensure use and share lessons
14 Step 1 - Engage Stakeholders
Who are the stakeholders?
Those involved in program operations, those affected by the program operations, and users of evaluation results
15 Step 2 - Describe the Program
What are the goals and specific aims of the program?
What problem or need is it designed to address?
What are the measurable objectives?
What are the strategies to achieve the objectives?
What are the expected effects?
What are the resources and activities?
How is the program supposed to work?

Before you can talk about evaluating a program, you have to agree on what the program actually is:
Understand its mission, objectives, and strategies
Uncover differences of opinion
Set a frame of reference for later decisions
Agree on goals and milestones
You want to describe the program in enough detail to have a solid understanding of its mission, objectives, and strategies. Think about the need for the program, its expected effects, activities, resources, stage of development, and context.
16 “I think you should be more explicit here in Step Two.” Cartoon by Sidney Harris, copyright The New Yorker
17 Step 3 - Focus the evaluation design
What do you want to know?
Consider the purpose, uses, questions, methods, roles, budgets, deliverables, etc.
An evaluation cannot answer all questions for all stakeholders.
Consider political viability, resources, practical procedures, etc.
18 Step 4 - Gather credible evidence
Evidence must be believable, trustworthy, and relevant
Consider information scope, sources, quality, and logistics
Methodology & data collection: who is studied, and when
What counts as evidence?
Evidence must be seen as trustworthy & relevant; this depends on the questions asked and stakeholders’ views
Evidence should be defensible and reliable
Systematic information:
sampling design
use of comparison groups
timing and frequency of data collection
issues of bias (sample and respondent)
19 Step 5 - “Justify” Conclusions
Consider data:
Analysis and synthesis – determine findings
Interpretation – what do the findings mean?
Judgments – what is the value of the findings based on accepted standards?
Recommendations – what claims can be made? What are the limitations of your design?
Conclusions must be linked to evidence & consistent with agreed-upon values or standards
20 Step 6 - Use and share results
Share lessons learned with stakeholders!
Provide feedback, offer briefings, and disseminate findings
What steps will you take to disseminate findings?
Provide feedback to stakeholders
Schedule follow-up meetings with users
Plan, prepare, and follow through
21 Next Session – Moving from the abstract to the concrete
Are you overwhelmed?