Workshop at the Cairo conference on Impact Evaluation 29 March 2009

Presentation transcript:

Monitoring? Evaluation? Impact Evaluation? Appreciating and Taking Advantage of the Differences
Workshop at the Cairo conference on Impact Evaluation, 29 March 2009
Burt Perrin, La Masque, 30770 Vissec, FRANCE
Burt@BurtPerrin.com, +33 4 67 81 50 11

Alternative title: Putting the “and” back in MandE

Plan for the workshop
- Participative approach – small group exercises, your real-world examples, general discussion
- Consider differences between monitoring and evaluation
- Strengths and limitations of each
- Use and misuse of performance indicators
- How to use monitoring and evaluation approaches appropriately and in a complementary fashion
- What is “impact evaluation” and where does it fit in?

What do we mean by Monitoring, and by Evaluation?

Monitoring – the concept and common definitions
- Tracking progress in accordance with previously identified objectives, indicators, or targets (plan vs. reality)
- RBM, performance measurement, performance indicators …
- In French: “suivi” vs. “contrôle”
- Some other uses of the term: any ongoing activity involving data collection and performance (usually internal, sometimes seen as self-evaluation)

Evaluation – some initial aspects
- Systematic, data-based
- Often can use data from monitoring as one source of information
- Can consider any aspect of a policy, programme, or project
- Major focus on assessing the impact of the intervention (i.e. attribution, cause)
- E - valua - tion

Frequent status of M&E
[Diagram: “monitoringandevaluation” run together as a single word, or RBM (Monitoring) vs. Evaluation]

Ideal situation – Monitoring and Evaluation complementary

Monitoring and Evaluation
Evaluation:
- Generally episodic, often external
- Can question the rationale and relevance of the program and its objectives
- Can identify unintended as well as planned impacts and effects
- Can address “how” and “why” questions
- Can provide guidance for future directions
- Can use data from different sources and from a wide variety of methods
Monitoring:
- Periodic, using data routinely gathered or readily obtainable, generally internal
- Assumes appropriateness of programme, activities, objectives, indicators
- Tracks progress against a small number of targets/indicators (one at a time)
- Usually quantitative
- Cannot indicate causality
- Difficult to use for impact assessment

MONITORING, EVALUATION AND IA
- Inputs: investments (resources, staff…) and activities
- Outputs: products
- Outcomes: immediate achievements of the project
- Impact: long-term, sustainable changes
Monitoring: what has been invested, done and produced, and how are we progressing towards the achievement of the objectives?
Evaluation: what occurred and what has been achieved as a result of the project?
Impact assessment: what long-term, sustainable changes have been produced (e.g. the contribution towards the elimination of child labour)?

Evaluation vs. Research
- Research: primary objective is knowledge generation
- Evaluation: refers to a particular type of situation; utilisation in some form is an essential component
- But: evaluation makes use of research methodologies

Monitoring data: quantitative only, or also qualitative?
- Some/most guidelines specify quantitative only
- Some nominally allow qualitative information, but:
[Table: reporting grid with columns Indicator | Q1 | Q2 | Q3 | Q4 | Yr]

Performance Indicators: a consideration of their limitations and potential for misuse
See, for example:
- Burt Perrin, “Effective Use and Misuse of Performance Measurement,” American Journal of Evaluation, Vol. 19, No. 3, pp. 367-369, 1998.
- Burt Perrin, “Performance Measurement: Does the Reality Match the Rhetoric?” American Journal of Evaluation, Vol. 20, No. 1, pp. 101-114, 1999.

Common flaws, limitations, and misuse of performance indicators – 1
- Goal displacement
- Terms and measures interpreted differently
- Distorted or inaccurate data
- Meaningless and irrelevant data
- Cost shifting vs. cost savings
- Critical subgroup differences hidden

Common flaws, limitations, and misuse of performance indicators – 2
- Do not take into account the larger context/complexities
- Limitations of objective-based approaches to evaluation
- Useless for decision making and resource allocation
- Can result in less focus on innovation, improvement and outcomes

The process of developing indicators – should include:
- Involvement of stakeholders
- Development, interpretation and revision of indicators
- Allocation of time and resources to the development of indicators
- Provision of training and expertise
- Thinking about potential forms of misuse in advance
- Pretesting, testing, review and revision

Using indicators appropriately – some basic strategic considerations
- First, do no harm
- Meaningful and useful at the grassroots – the programme, staff, local stakeholders
- NOT linked to budget allocations or managerial rewards
- Use only when it makes sense, e.g. Mintzberg, Pollitt/OECD:
  - Standardised programmes – recurrent products/services
  - Established programmes with a basis for identifying meaningful indicators and targets
  - NOT for tangible individual services
  - NOT for non-tangible ideal services

Using indicators appropriately – strategic considerations – 2
- Use indicators as indicators
- At best, a window vs. reality
- To raise questions rather than to provide the “answer”
- Different levels (e.g. inputs, activities, outputs, outcomes where it makes sense)

Using indicators appropriately – strategic considerations – 3
- Focus on results vs. busy-ness
- Performance information vs. performance data
- Descriptive vs. numerical indicators
- Performance MANAGEment vs. MEASUREment (original intent diverted from management to control)
- Periodically review the overall picture – ask if the “data” make sense, identify questions arising
- Indicators as part of a broad evaluation strategy

Using indicators appropriately – operational considerations
- Look at subgroup differences
- Indicators/targets indicating direction vs. assessing performance – if the latter, don’t set up the programme for failure
- Dynamic vs. static
- Never right the first time: constantly reassess validity and meaningfulness
- Pre-test, pre-test, pre-test
- Update and revise
- Provide feedback – and assistance as needed

Using indicators appropriately – reporting
- More vs. less information in reports
- Performance story vs. list of numbers
- Identify limitations – provide qualifications
- Combine with other information
- Request/provide feedback

Evaluation

A strategic approach to evaluation
- Raison d’être of evaluation: social betterment, sensemaking
- More generally, raison d’être of evaluation: to be used! Improved policies, programmes, projects, services, thinking

Monitoring and Evaluation
Monitoring:
- Periodic, using data routinely gathered or readily obtainable
- Assumes appropriateness of programme, activities, objectives, indicators
- Tracks progress against a small number of targets/indicators (one at a time)
- Usually quantitative
- Cannot indicate causality
- Difficult to use for impact assessment
Evaluation:
- Generally episodic
- Can question the rationale and relevance of the program and its objectives
- Can identify unintended as well as planned impacts and effects
- Can provide guidance for future directions
- Can address “how” and “why” questions
- Can use data from different sources and from a wide variety of methods

Future orientation - Dilemma “The greatest dilemma of mankind is that all knowledge is about past events and all decisions about the future. The objective of this planning, long-term and imperfect as it may be, is to make reasonably sure that, in the future, we may end up approximately right instead of exactly wrong.”

Questions for evaluation
- Start with the questions – choice of methods to follow
- How to identify questions: who can use evaluation information? What information can be used? How?
- Different stakeholders – different questions
- Consider responses to hypothetical findings
- Develop the theory of change (logic model)

The three key evaluation questions
- What’s happening? (planned and unplanned, little or big, at any level)
- Why?
- So what?

Some uses for evaluation
- Programme improvement
- Identify new policies, programme directions, strategies
- Programme formation
- Decision making at all levels
- Accountability
- Learning
- Identification of needs
- Advocacy
- Instilling an evaluative/questioning culture

Different types of evaluation
- Ex-ante vs. ex-post
- Process vs. outcome
- Formative vs. summative
- Descriptive vs. judgemental
- Accountability vs. learning (vs. advocacy vs. pro-forma)
- Short-term actions vs. long-term thinking
- Etc.

Results chain: Inputs → Processes → Outputs → Reach → Outcomes → Impact

Intervention logic model

Generic logic model (simplified)

Generic logic model – in context

Making evaluation useful – 1
- Be strategic, e.g. start with the big picture – identify questions arising
- Focus on priority questions and information requirements
- Consider the needs and preferences of key evaluation users
- Don’t be limited to stated/intended effects
- Don’t try to do everything in one evaluation

Making evaluation useful – 2
- Primary focus: how evaluation can be relevant and useful
- Bear the beneficiaries in mind
- Take into account diversity, including differing world views, logics, and values
- Be an (appropriate) advocate
- Don’t be too broad
- Don’t be too narrow

How else can one practice evaluation so that it is useful?
- Follow the Golden Rule: “There are no golden rules.” (European Commission)
- Art as much as science
- Be future oriented
- Involve stakeholders
- Use multiple and complementary methods, qualitative and quantitative
- Recognize differences between monitoring and evaluation

To think about …
- Constructive approach, emphasis on learning vs. punishment
- Good practices (not just problems)
- Take into account complexity theory, systems approaches, chaos theory
- Synthesis, knowledge management
- Establishing how/if the intervention in fact is responsible for results (attribution or cause)

Impact evaluation/assessment: what does this mean?
- OECD/DAC definition of impact: positive and negative, primary and secondary long-term effects produced by a development intervention, directly or indirectly, intended or unintended.
- Development objective: intended impact contributing to physical, financial, institutional, social, environmental, or other benefits to a society, community, or group of people via one or more development interventions.
- But beware! ‘Impact’ and ‘impact assessment’ are frequently used in very different ways.

Determining attribution – some alternative approaches
- Experimental/quasi-experimental designs (randomisation)
- Eliminate rival plausible hypotheses
- Physical (qualitative) causality
- Theory of change approach – “reasonable attribution”
- “Contribution” vs. “cause”
- Contribution analysis (simplest approach – at needed confidence)

Some considerations for meaningful impact evaluation
- Need information about inputs and activities as well as about outcomes
- Check, don’t assume, that what is mandated in (Western) capitals is what actually takes place on the ground
- Check: are data sources really accurate?
- Dealing with responsiveness – a problem or a strength?
- Internal vs. external validity

Some questions about impact evaluation
- What is possible with multiple interventions?
- Changing situations
- Strategies/policies vs. projects
- Time frame?

Monitoring and Evaluation in Combination

How Monitoring and Evaluation can be complementary
Ongoing monitoring:
- Can identify questions and issues for (in-depth) evaluation
- Can provide data for evaluation
Evaluation:
- Can identify what should be monitored in the future

Monitoring vs. Evaluation
- Start with the purpose and question(s), e.g. control vs. learning/improvement
- Identify information requirements (for whom, how they would be used …)
- Articulate the theory of change
- Use the most appropriate method(s) given the above: some form of monitoring approach? and/or some form of evaluation?
- Do not use monitoring when evaluation is most appropriate – and vice versa
- Consider costs (financial, staff time) and timeliness – monitoring is usually, but not always, less costly and quicker

Mon. and Eval. in combination
- Multi-method approach to evaluation usually most appropriate – can include monitoring
- Generally monitoring is most appropriate as part of an overall evaluation approach, e.g. use evaluation to expand upon the “what” information from monitoring, and to address “why” and “so what” questions
- Strategic questions → strategic methods
- Seek the minimum amount of information that addresses the right questions and that will actually be used
- Tell the performance story
- Take a contribution analysis approach

Contribution Analysis (Mayne: Using performance measures sensibly)
1. Develop the results chain
2. Assess the existing evidence on results
3. Assess the alternative explanations
4. Assemble the performance story
5. Seek out additional evidence
6. Revise and strengthen the performance story

Conclusion
Go forward, monitor and evaluate – and help to make a difference.
Thank you / Merci pour votre participation.
Burt Perrin – Burt@BurtPerrin.com