Presentation transcript:

Setting the Stage: Workshop Framing and Crosscutting Issues
Simon Hearn, ODI
Evaluation Methods for Large-Scale, Complex, Multi-National Global Health Initiatives, Institute of Medicine, 7-8 January, London, UK

What do we mean by complex interventions?
The nature of the intervention:
1. Focus of objectives
2. Governance
3. Consistency of implementation
How it works:
4. Necessariness
5. Sufficiency
6. Change trajectory
Photo: Les Chatfield

What are the challenges of evaluating complex interventions?
- Describing what is being implemented
- Getting data about impacts
- Attributing impacts to a particular programme
Photo: Les Chatfield

Why a framework is needed
Image: Wikipedia, "Evaluation Methods"

Why a framework is needed
Image: Simon Kneebone

The Rainbow Framework

DEFINE what is to be evaluated

Why do we need to start with a clear definition?
Photo: Hobbies on a Budget / Flickr

1. Develop initial description
2. Develop program theory or logic model
3. Identify potential unintended results

Options for representing logic models:
- Pipeline / results chain
- Logical framework
- Outcomes hierarchy / theory of change
- Realist matrix
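
The first of these options, the pipeline / results chain, is simple enough to capture directly in code. Below is a minimal sketch in Python; the programme and all of its stages are invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class ResultsChain:
    """A pipeline logic model: each stage is assumed to lead to the next."""
    inputs: list[str] = field(default_factory=list)
    activities: list[str] = field(default_factory=list)
    outputs: list[str] = field(default_factory=list)
    outcomes: list[str] = field(default_factory=list)
    impacts: list[str] = field(default_factory=list)

    def describe(self) -> str:
        stages = ["inputs", "activities", "outputs", "outcomes", "impacts"]
        return "\n".join(f"{s.upper()}: {'; '.join(getattr(self, s))}" for s in stages)

# Hypothetical example: a vaccination outreach programme
chain = ResultsChain(
    inputs=["funding", "vaccines", "trained staff"],
    activities=["run mobile clinics", "community outreach"],
    outputs=["children vaccinated"],
    outcomes=["reduced disease incidence"],
    impacts=["improved child health"],
)
print(chain.describe())
```

The other three representations add structure a flat chain cannot hold (assumptions, multiple pathways, context-mechanism links), which is exactly why the choice of representation is itself an evaluation decision.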

FRAME what is to be evaluated

Frame decision → Make decision
Frame evaluation → Design evaluation
Source: Hobbies on a Budget / Flickr

1. Identify primary intended users
2. Decide purpose(s)
3. Specify key evaluation questions
4. Determine what ‘success’ looks like

DESCRIBE what happened

1. Sample
2. Use measures, indicators or metrics
3. Collect and/or retrieve data
4. Manage data
5. Combine qualitative and quantitative data
6. Analyze data
7. Visualize data
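
Steps 2 and 6 are the most mechanical of these, so a toy example may help. The sketch below computes a single coverage indicator from hypothetical collected records (all names and figures invented), using only the Python standard library, and then disaggregates it by district.

```python
# Hypothetical collected records: one row per surveyed household
records = [
    {"district": "North", "children": 4, "vaccinated": 3},
    {"district": "North", "children": 2, "vaccinated": 2},
    {"district": "South", "children": 5, "vaccinated": 2},
]

def coverage(rows):
    """Indicator: share of children vaccinated across the given rows."""
    total = sum(r["children"] for r in rows)
    return sum(r["vaccinated"] for r in rows) / total

# Analyze: overall indicator, then disaggregated by district
print(f"Overall coverage: {coverage(records):.0%}")
for district in sorted({r["district"] for r in records}):
    subset = [r for r in records if r["district"] == district]
    print(f"  {district}: {coverage(subset):.0%}")
```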

Combine qualitative and quantitative data
Purposes: enrich, examine, explain, triangulate
Timing: parallel or sequential
Design: component or integrated

UNDERSTAND CAUSES of outcomes and impacts

Outcomes → Impacts

As a profession, we often either oversimplify causation or we overcomplicate it!

“In my opinion, measuring attribution is critical, and we can't do that unless we use control groups to compare them to.”
- Comment in an expert discussion on The Guardian online, May 2013

1. Check that the results support causal attribution
2. Compare results to the counterfactual (see the sketch below)
3. Investigate possible alternative explanations
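
Step 2 is where a comparison group often comes in: its change over time stands in for the counterfactual. A minimal sketch of the resulting difference-in-differences arithmetic, with invented outcome scores:

```python
from statistics import mean

# Hypothetical outcome measurements (e.g., a health score) before and after
treated_before = [52, 48, 50, 55]
treated_after = [61, 58, 63, 60]
comparison_before = [51, 49, 53, 50]
comparison_after = [54, 52, 55, 53]

# Change observed in each group
treated_change = mean(treated_after) - mean(treated_before)
comparison_change = mean(comparison_after) - mean(comparison_before)

# Difference-in-differences: the comparison group's change approximates
# what would have happened to the treated group without the programme.
did = treated_change - comparison_change
print(f"Treated change: {treated_change:+.1f}")
print(f"Counterfactual (comparison) change: {comparison_change:+.1f}")
print(f"Estimated programme effect: {did:+.1f}")
```

Note that this is only one of several ways to address the causal question; as the quote above illustrates, control groups are often treated as the only route to attribution, which the framework explicitly resists.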

SYNTHESIZE data from one or more evaluations

Was it good? Did it work? Was it effective? For whom did it work? In what ways did it work? Was it value for money? Was it cost-effective? Did it succeed in terms of the Triple Bottom Line?

How do we synthesize diverse evidence about performance?
[Table: three example programs rated on "all intended impacts achieved", "some intended impacts achieved" and "no negative impacts", with overall syntheses of GOOD, ?? and BAD]
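
One way to get from mixed evidence to an overall verdict is an explicit synthesis rubric: agree in advance how much each criterion counts and which score patterns earn which rating. The sketch below is a deliberately crude illustration; the criteria, scores, weights and thresholds are all invented.

```python
# Hypothetical rubric: each criterion scored 0 (not met) to 2 (fully met),
# with a weight reflecting how much it matters to the overall judgement.
rubric = {
    "intended impacts achieved": {"score": 1, "weight": 3},  # some achieved
    "no negative impacts":       {"score": 2, "weight": 2},
    "value for money":           {"score": 1, "weight": 1},
}

max_total = sum(2 * c["weight"] for c in rubric.values())
total = sum(c["score"] * c["weight"] for c in rubric.values())

# Map the weighted share of possible points to an overall verdict
share = total / max_total
verdict = "GOOD" if share >= 0.75 else "BAD" if share < 0.4 else "MIXED"
print(f"Weighted score: {total}/{max_total} -> {verdict}")
```

The point is not the arithmetic but the transparency: without an agreed rubric, the same evidence can be synthesized as GOOD by one reader and BAD by another.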

1. Synthesize data from a single evaluation
2. Synthesize data across evaluations
3. Generalize findings

REPORT and SUPPORT USE of findings

“I can honestly say that not a day goes by when we don’t use those evaluations in one way or another.”

1. Identify reporting requirements
2. Develop reporting media
3. Ensure accessibility
4. Develop recommendations
5. Support use

MANAGE your evaluation

1. Understand and engage with stakeholders
2. Establish decision-making processes
3. Decide who will conduct the evaluation
4. Determine and secure resources
5. Define ethical and quality evaluation standards
6. Document management processes and agreements
7. Develop evaluation plan or framework
8. Review evaluation
9. Develop evaluation capacity

Making decisions: look at the type of questions
- Descriptive questions (DESCRIBE) – Was the policy implemented as planned?
- Causal questions (UNDERSTAND CAUSES) – Did the policy change contribute to improved health outcomes?
- Synthesis questions (SYNTHESIZE) – Was the policy overall a success?
- Action questions (REPORT & SUPPORT USE) – What should we do?

Making decisions: compare pros and cons

Making decisions: create an evaluation matrix

Key evaluation question | Participant questionnaire | Key informant interviews | Project records | Observation of program implementation
KEQ1 What was the quality of implementation? | ✔ | ✔ | ✔ | ✔
KEQ2 To what extent were the program objectives met? | ✔ | ✔ | ✔ |
KEQ3 What other impacts did the program have? | ✔ | ✔ | |
KEQ4 How could the program be improved? | ✔ | ✔ | | ✔
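
A matrix like this is just a mapping from questions to data sources, so it can be kept machine-checkable. A minimal Python sketch, mirroring the matrix above (cell placements for KEQ2 and KEQ3 are partly inferred from the slide), with a simple triangulation check:

```python
# Evaluation matrix as data: key evaluation questions -> data sources
matrix = {
    "KEQ1 Quality of implementation": [
        "participant questionnaire", "key informant interviews",
        "project records", "observation"],
    "KEQ2 Extent objectives were met": [
        "participant questionnaire", "key informant interviews", "project records"],
    "KEQ3 Other impacts": [
        "participant questionnaire", "key informant interviews"],
    "KEQ4 Possible improvements": [
        "participant questionnaire", "key informant interviews", "observation"],
}

# Simple design check: every question should be answerable from at least
# two independent sources, so that findings can be triangulated.
for keq, sources in matrix.items():
    status = "OK" if len(sources) >= 2 else "NEEDS MORE SOURCES"
    print(f"{keq}: {len(sources)} source(s) - {status}")
```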

Examples, Descriptions, Tools, Guides, Comments, R & D, Documenting, Sharing, Events

Founding Partners and Financial Supporters

For more information: