Impact Evaluation in Education Introduction to Monitoring and Evaluation Andrew Jenkins 23/03/14.


The essence of theory of change – linking activities to intended outcomes. Asked what they are doing, one stonemason says "I am cutting rocks"; another says "I am building a temple".

Theory of change: "the process through which it is expected that inputs will be converted to expected outputs, outcome and impact" – DfID, Further Business Case Guidance, "Theory of Change"

Theory of change Start with a RESULTS CHAIN

The results chain: tips

    Activities               Outputs                      Outcomes
    We produce               We influence                 We contribute to
    We control               Clients                      Clients control
    We are accountable for   We expect                    Should occur
    100% attribution         Some attribution             Partial attribution
    Readily changed          Less flexibility to change   Long term
    Delivered annually       By end of program            Long-term

Monitoring – activities and outputs

Personal Monitoring Tools

No monitoring - blind and deaf

Monitoring and Evaluation. Monitoring – Efficiency: measures how productively inputs (money, time, personnel, equipment) are being used in the creation of outputs (products, results); an efficient organisation is one that achieves its objectives with the least expenditure of resources. Evaluation – Effectiveness: measures the degree to which results/objectives have been achieved; an effective organisation is one that achieves its results and objectives.

MONITORING focuses on the project process (per individual project); EVALUATION focuses on the effectiveness of that process (across many projects). Inputs (all projects): resources, staff, funds, facilities, supplies, training. Outputs (most projects): project deliverables achieved; a "count" (quantification) of what has been done. Outcomes (some projects): short and intermediate effects, and long-term effects and changes.

Resist temptation – there must be a better way! Clear objectives; few key indicators; quick, simple methods; existing data sources; participatory methods; short feedback loops; action on results!

Monitoring/evaluation objectives must be SMART: Specific, Measurable, Achievable, Realistic, Timed (see 10 Easy Mistakes, page 5)

Evaluation: who evaluates whom? The value of a joint approach

The Logical Chain: 1. Define objectives (and methodology); 2. Supply inputs; 3. Achieve outputs; 4. Generate outcomes; 5. Identify and measure indicators; 6. Evaluate by comparing objectives with indicators; 7. Redefine objectives (and methodology).

Impact Evaluation: an assessment of the causal effect of a project, program or policy on beneficiaries. It uses a counterfactual: Impact = Outcomes − what would have happened anyway.
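
The counterfactual arithmetic on this slide can be sketched in a couple of lines. All numbers here are invented for illustration:

```python
# Hypothetical literacy program: treated students average 72 points,
# and the estimated counterfactual (what they would have scored without
# the program) is 65. The counterfactual is never directly observed;
# the methods on the following slides are ways of estimating it.
observed_outcome = 72.0
counterfactual = 65.0

impact = observed_outcome - counterfactual
print(impact)  # 7.0
```

The whole difficulty of impact evaluation lies in estimating `counterfactual` credibly; the observed outcome alone tells us nothing about causation.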

When to use impact evaluation? Evaluate impact when the project is innovative, replicable/scalable, or strategically relevant for reducing poverty; when the evaluation will fill a knowledge gap; and when substantial policy impact is expected. Use evaluation within a program to test alternatives and improve the program.

Impact evaluation answers: What was the effect of the program on outcomes? How much better off are the beneficiaries because of the program? How would outcomes change under a different program design? Is the program cost-effective?

Different methods to measure impact. Experimental: randomised assignment. Non-experimental: matching; difference-in-differences; regression discontinuity design; instrumental variables / random promotion.

Randomization: the "gold standard" for evaluating the effects of interventions. It allows us to form "treatment" and "control" groups with identical characteristics (on average) that differ only in the intervention. Counterfactual: the randomized-out group.
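
A minimal simulation of why random assignment works. Everything here is invented (1,000 villages, a true effect of 5 points, normally distributed noise); the point is only that under random assignment the simple difference in means recovers the true effect:

```python
import random
import statistics

# Hypothetical: 1000 villages, half randomly assigned to treatment.
random.seed(42)
villages = list(range(1000))
random.shuffle(villages)
treatment, control = villages[:500], villages[500:]

def outcome(treated):
    # Simulated score: baseline 50, program adds 5 points, plus noise.
    return 50.0 + (5.0 if treated else 0.0) + random.gauss(0, 10)

treat_mean = statistics.mean(outcome(True) for _ in treatment)
control_mean = statistics.mean(outcome(False) for _ in control)

# Because assignment was random, the control group is a valid
# counterfactual, so the difference in means estimates the true
# effect (close to 5).
print(round(treat_mean - control_mean, 1))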

Matching uses large data sets and intensive statistical techniques to construct the best possible artificial comparison group for a given treatment group. Units are selected on the basis of similarities in observed characteristics; the method assumes no selection bias based on unobservable characteristics. Counterfactual: the matched comparison group.
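
A toy sketch of one-to-one nearest-neighbour matching on a single observed characteristic (baseline income). All data are invented; real applications typically match on a propensity score estimated from many covariates:

```python
# Hypothetical treated and comparison units with one observed
# characteristic ("income") and an outcome.
treated = [{"income": 20, "outcome": 30},
           {"income": 40, "outcome": 55}]
comparison = [{"income": 19, "outcome": 24},
              {"income": 42, "outcome": 48},
              {"income": 80, "outcome": 90}]

def nearest(unit, pool):
    # Match on observables only: this embodies the method's key
    # (untestable) assumption of no selection on unobservables.
    return min(pool, key=lambda c: abs(c["income"] - unit["income"]))

effects = [t["outcome"] - nearest(t, comparison)["outcome"] for t in treated]
att = sum(effects) / len(effects)  # average effect on the treated
print(att)  # 6.5
```

Here each treated unit's counterfactual outcome is taken from its closest comparison unit: (30 − 24) and (55 − 48), averaging to 6.5.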

Difference-in-differences compares the change in outcomes over time between the treatment group and the comparison group. It controls for factors that are constant over time in both groups, and assumes 'parallel trends' between the two groups in the absence of the program. Counterfactual: the change over time among the non-participants.
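
The difference-in-differences estimator reduces to simple arithmetic on four group means. The numbers below are invented for illustration:

```python
# Hypothetical before/after means for the two groups.
treat_pre, treat_post = 50.0, 60.0
comp_pre, comp_post = 48.0, 52.0

# The comparison group's change over time (52 - 48 = 4) estimates what
# would have happened to the treatment group anyway, under the
# parallel-trends assumption. Subtracting it from the treatment
# group's change (60 - 50 = 10) gives the impact estimate.
did = (treat_post - treat_pre) - (comp_post - comp_pre)
print(did)  # 6.0
```

Note that a naive post-program comparison (60 vs 52) or a naive before/after comparison (60 vs 50) would both overstate the impact here.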

Uses of different designs

    Design                     When to use
    Randomization              Whenever possible; when an intervention will not be universally implemented
    Random promotion           When an intervention is universally implemented
    Regression discontinuity   If an intervention is assigned based on rank
    Diff-in-diff               If two groups are growing at similar rates
    Matching                   When other methods are not possible; matching at baseline can be very useful

Qualitative and Quantitative Methods Qualitative methods focus on how results were achieved (or not). They can be very helpful for process evaluation. It is often very useful to conduct a quick qualitative study before planning an experimental (RCT) study.

Thank you!