Measuring Value: Using Program Evaluation to Understand What’s Working -- Or Isn’t. Juliana M. Blome, Ph.D., MPH, Office of Program Analysis and Evaluation.


Measuring Value: Using Program Evaluation to Understand What’s Working -- Or Isn’t Juliana M. Blome, Ph.D. , MPH Office of Program Analysis and Evaluation National Institute of General Medical Sciences MORE Program Directors Meeting Colorado Springs, Colorado June 12, 2009

“Program Evaluation…!?”

Program Evaluation: What is it? Program evaluations are individual, systematic studies that use objective measurement and analysis to answer specific questions about how well a program is working. - GAO/GGD-00-204

Evaluation Answers Questions Such As… Does it work? How well does it work? Does it do what we want it to? Does it work for the reasons we think it does? Is it cost-effective? Are the benefits worth it? What are the unintended consequences?

Research vs. Program Evaluation
Program Evaluation: Judges merit or worth. Policy & program interests of stakeholders are paramount. Provides information for decision-making on a specific program. Conducted within a setting of changing actors, priorities, resources, & timelines.
Research: Produces generalizable knowledge. Scientific inquiry based on intellectual curiosity. Advances broad knowledge and theory. Conducted in a controlled setting.
“Research seeks to prove; evaluation seeks to improve.” – Michael Quinn Patton

Why bother? To gain insight about a program and its operations To improve practice - to modify or adapt practices to enhance the likelihood of success To assess effects – to determine if we’re meeting our goals and provide evidence of effectiveness

Guidelines for Conducting Successful Evaluations Invest heavily in planning early on Integrate evaluation into ongoing program activities Use knowledgeable, experienced evaluators

Evaluator Skills Evaluation theory and methods Research methods (design, planning, statistics, qualitative and quantitative methods) Data collection, analysis and interpretation Communication and Interpersonal skills Content area skills Project management Ethics At universities and colleges, this type of expertise is found in the social and behavioral sciences departments!

Evaluation Costs The National Science Foundation’s “rule of thumb” for evaluation budgets is 10% of the total grant amount.
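As a back-of-the-envelope illustration of that rule of thumb (the grant total below is a made-up figure, and 10% is only the rough guideline quoted above, not a requirement for any particular award), the arithmetic looks like this:

```python
# Rough evaluation budget using the ~10% rule of thumb.
# The grant total is a hypothetical example figure.

def evaluation_budget(total_grant: float, share: float = 0.10) -> float:
    """Return the suggested evaluation budget as a share of the total grant."""
    return total_grant * share

total_grant = 1_250_000  # hypothetical multi-year grant total, in dollars
print(f"Suggested evaluation budget: ${evaluation_budget(total_grant):,.0f}")
# -> Suggested evaluation budget: $125,000
```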

Types of Evaluations
Needs Assessment – What is the nature & extent of the issues the program should address? (Planning phase)
Feasibility Study – Is an evaluation appropriate and/or affordable? Is the program mature enough, and is the timing right? Should a process or an outcome evaluation be produced?
Process Evaluation – Is the program being conducted & producing output as planned? How can the process be improved?
Outcome Evaluation – To what extent have a program’s goals been met?
Notes: A needs assessment usually looks at the needs of stakeholders, at developing appropriate program goals, and at how to design or modify a program to achieve those goals; it is usually a tool for strategic planning and priority setting. A feasibility study asks: What’s the best way to evaluate the program? Is this the right time to conduct an evaluation? Can it be conducted at a reasonable cost? Is the program mature enough that it is reasonable to expect outcomes at this time? It determines which evaluation design and data collection strategies can and should be used. A process evaluation looks at program operations to determine whether the program is being conducted as planned, whether output is being produced, and how processes can be improved, often against a comparison group or a recognized standard of operations. An outcome evaluation examines program accomplishments and effects to determine whether the program is meeting its intermediate and long-term goals, often comparing current program performance against prior performance, a comparable control group, or recognized standards of performance.
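For readers who prefer to scan this as data, a minimal sketch (the wording is paraphrased from the slide above, and the structure itself is only an illustration, not an official taxonomy) might encode the four types and their guiding questions like this:

```python
# Illustrative mapping of evaluation types to their guiding questions,
# paraphrased from the slide above.
EVALUATION_TYPES = {
    "needs assessment": "What is the nature and extent of the issues the program should address?",
    "feasibility study": "Is an evaluation appropriate, affordable, and well timed?",
    "process evaluation": "Is the program operating and producing output as planned?",
    "outcome evaluation": "To what extent have the program's goals been met?",
}

for etype, question in EVALUATION_TYPES.items():
    print(f"{etype.title():>20}: {question}")
```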

CDC Framework for Program Evaluation (for more info: http://www.cdc.gov/EVAL/)
Steps (around the cycle): Engage stakeholders; Describe the program; Focus the evaluation design; Gather credible evidence; Justify conclusions; Use and share lessons learned.
Standards (at the center): Utility, Feasibility, Propriety, Accuracy.
Milstein et al., Health Promotion Practice, July 2000, Vol. 1(3): 221-228

Evaluation Standards
Utility – Evaluations should serve the practical information needs of a given audience.
Feasibility – Evaluations take place in the field and should be realistic, prudent, diplomatic, and frugal.
Propriety – The rights of individuals affected by evaluations should be protected.
Accuracy – Evaluations should produce and convey accurate information about a program’s merit and/or worth.
Notes: Utility – Who needs the information and what information do they need? Will the evaluation provide relevant, useful information in a timely manner? Feasibility – How much money, time, and effort can we put into this? Is the planned evaluation realistic given the time, resources, and expertise available? Propriety – What steps need to be taken for the evaluation to be ethical and legal? Does it protect the rights and welfare of the individuals involved? Does it engage those affected by the program and the evaluation? Accuracy – What design will lead to accurate information? Will it produce valid and reliable findings? See also: Guiding Principles for Evaluators, American Evaluation Association, www.eval.org

CDC Framework: Key Steps in Evaluation Engage stakeholders Describe the program Focus the evaluation design Gather credible evidence Justify conclusions Ensure use and share lessons

Step 1 - Engage Stakeholders Who are the stakeholders? Those involved in program operations, those affected by the program, and the users of the evaluation results.

Step 2 - Describe the Program What are the goals and specific aims of the program? What problem or need is it designed to address? What are the measurable objectives? What are the strategies to achieve the objectives? What are the expected effects? What are the resources and activities? How is the program supposed to work?
Notes: Before you can talk about evaluating a program, you have to agree on what the program actually is. Understand its mission, objectives, and strategies; uncover differences of opinion; set a frame of reference for later decisions; and agree on goals and milestones. You want to describe the program in enough detail that you have a solid understanding of its mission, objectives, and strategies, and to think about the need for the program, its expected effects, activities, resources, stage of development, and context. One way to keep these elements straight is sketched below.
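As an organizational aid only (the field names and the example program are hypothetical, not part of the CDC framework), a program description can be captured in a small data structure so that each question above has a concrete place to point at:

```python
# Minimal container for a program description; field names are illustrative.
from dataclasses import dataclass, field

@dataclass
class ProgramDescription:
    name: str
    need_addressed: str                 # problem the program is designed to address
    goals: list[str]                    # broad goals and specific aims
    measurable_objectives: list[str]    # objectives you can actually measure
    strategies: list[str]               # how the objectives will be pursued
    resources: list[str] = field(default_factory=list)
    activities: list[str] = field(default_factory=list)
    expected_effects: list[str] = field(default_factory=list)

# Hypothetical example, for illustration only.
mentoring = ProgramDescription(
    name="Undergraduate research mentoring program",
    need_addressed="Low retention of students in research careers",
    goals=["Increase the number of students entering graduate research training"],
    measurable_objectives=["70% of participants apply to graduate programs within 2 years"],
    strategies=["Pair each student with a faculty mentor", "Provide summer research stipends"],
    resources=["Program staff", "Mentor pool", "Stipend funds"],
    activities=["Mentor matching", "Monthly seminars", "Summer research placements"],
    expected_effects=["Higher graduate-school application and enrollment rates"],
)
print(mentoring.name, "->", mentoring.expected_effects[0])
```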

“I think you should be more explicit here in Step Two.” By Sidney Harris, Copyright 2007, The New Yorker

Step 3 - Focus the evaluation design What do you want to know? Consider the purpose, uses, questions, methods, roles, budgets, deliverables, etc. An evaluation cannot answer all questions for all stakeholders. Consider political viability, resources, practical procedures, etc.

Step 4 - Gather credible evidence Evidence must be believable, trustworthy, and relevant. Consider: information scope, sources, quality, and logistics; methodology & data collection; who is studied and when; and what counts as evidence.
Notes: Evidence must be seen as trustworthy and relevant, which depends on the questions asked and on stakeholders’ views, and it should be defensible and reliable. Gather information systematically: sampling design, use of comparison groups, timing and frequency of data collection, and issues of bias (sample and respondent).
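To make “use of comparison groups” concrete, here is a minimal sketch. The outcome scores are fabricated toy numbers, and a real evaluation would need an appropriate design, adequate sample sizes, and checks of the test’s assumptions; this only shows the mechanics of a two-group comparison.

```python
# Toy comparison of an outcome measure between program participants and a
# comparison group; the numbers are invented for illustration.
from statistics import mean
from scipy import stats  # pip install scipy

program_scores = [72, 81, 68, 90, 77, 85, 79, 74]      # hypothetical outcome scores
comparison_scores = [65, 70, 62, 78, 69, 73, 66, 71]

diff = mean(program_scores) - mean(comparison_scores)
t_stat, p_value = stats.ttest_ind(program_scores, comparison_scores, equal_var=False)

print(f"Mean difference (program - comparison): {diff:.1f} points")
print(f"Welch's t-test: t = {t_stat:.2f}, p = {p_value:.3f}")
```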

Step 5 - “Justify” Conclusions Consider the data: Analysis and synthesis – determine the findings. Interpretation – what do the findings mean? Judgments – what is the value of the findings based on accepted standards? Recommendations – what claims can be made, and what are the limitations of your design? Conclusions should be linked to the evidence & consistent with agreed-upon values or standards.
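Continuing the toy example above, judging findings against “accepted standards” might amount to comparing the observed result with a threshold the stakeholders agreed on in advance. The 5-point threshold and the scores below are made up for illustration.

```python
# Judging a finding against a pre-agreed standard; all numbers are illustrative.
from statistics import mean

program_scores = [72, 81, 68, 90, 77, 85, 79, 74]
comparison_scores = [65, 70, 62, 78, 69, 73, 66, 71]

AGREED_MINIMUM_IMPROVEMENT = 5.0  # points; set with stakeholders before the analysis

observed = mean(program_scores) - mean(comparison_scores)
if observed >= AGREED_MINIMUM_IMPROVEMENT:
    print(f"Observed difference of {observed:.1f} points meets the agreed standard.")
else:
    print(f"Observed difference of {observed:.1f} points falls short of the agreed standard.")
```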

Step 6 - Use and share results Share lessons learned with stakeholders! Provide feedback, offer briefings, and disseminate findings. What steps will you take to disseminate the findings? Provide feedback to stakeholders, schedule follow-up meetings with users, and plan, prepare, and follow through.

Are you overwhelmed? Next Session – Moving from the abstract to the concrete