Learning about learning: The GEC approach to M&E
UKFIET Conference, Joseph Holden & Jason Calvert, 15th September 2015 © PEAS



Structure
1. The GEC approach to M&E
 - Background
 - Focus on outcomes and PbR
 - Quantitative benchmark for achievement
 - Mixed methods for outputs
 - The evaluator model
2. Challenges with the approach
 - In-school vs. out-of-school approaches
 - Enumerators and testing
 - PbR: accountability vs. flexibility
 - Cost and capacity

The GEC: 37 projects working in 18 countries

The GEC has three major outcomes…
- ATTENDANCE: girls that stay in and attend school throughout the life cycle of the project. PbR indicators: attendance, net enrolment.
- LEARNING: girls supported by the GEC with improved learning outcomes. PbR indicators: literacy, numeracy.
- LEVERAGE: additional funds secured and sustainability mechanisms established. Non-PbR indicators: match funding, influence on policy and community perceptions.

A rigorous approach means control groups…
[Chart: learning outcomes over time for the intervention group and the control group (the counterfactual), from the point the GEC intervention commences, showing the two trajectories diverging.] Outcomes are defined as ‘additional’ learning: the improvement of the intervention group over and above the control group trajectory.
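The ‘additional’ learning idea above can be sketched as a simple difference-in-differences calculation. All scores below are invented example values, not GEC data:

```python
# Hypothetical mean test scores at baseline and endline (invented example data).
base_intervention, end_intervention = 40.0, 55.0
base_control, end_control = 41.0, 48.0

gain_intervention = end_intervention - base_intervention  # 15.0
gain_control = end_control - base_control                 # 7.0: the counterfactual trajectory

# 'Additional' learning: the intervention group's gain over and above the control group's.
additional_learning = gain_intervention - gain_control
print(additional_learning)  # 8.0
```

The control group's gain stands in for what the intervention girls would have achieved anyway, so only the difference between the two gains is attributed to the project.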

With a consistent approach to target setting…
[Charts: score frequency distributions annotated with the mean (μ), standard deviation (σ) and 75th percentile, for (a) test scores of girls one grade above at baseline and (b) the expected distribution of girls in the intervention group at midline.] The target is the expected improvement on the learning test over and above the control group, set at 0.2 standard deviations (Y = 0.2 SD).
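The benchmarking logic can be illustrated with a minimal sketch, assuming a hypothetical sample of baseline scores for girls one grade above; `statistics` is Python's standard library module:

```python
import statistics

# Hypothetical baseline test scores for girls one grade above (invented data).
grade_above_scores = [35, 42, 48, 51, 55, 58, 61, 64, 70, 76]

mu = statistics.mean(grade_above_scores)      # benchmark mean (μ)
sigma = statistics.stdev(grade_above_scores)  # benchmark spread (σ)
p75 = statistics.quantiles(grade_above_scores, n=4)[2]  # 75th percentile benchmark

# Target: beat the control group's expected score by 0.2 standard deviations.
target_margin = 0.2 * sigma
```

Expressing the target in standard deviations of a benchmark distribution lets projects with different tests and contexts be held to a comparable standard.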

Evidence supports the focus on learning…
[Chart: EGRA words per minute (wpm) for Grade 5 girls in selected GEC projects, plus an estimate from the United States, compared with an estimate of the reading speed required for comprehension.]

But also a mixed methods approach…
- IMPACT (long-term study): improved employment, reduced child marriage, etc.
- OUTCOMES (outcome spreadsheet & mid/endline): learning, attendance.
- OUTPUTS (logframe): improved teaching skills, improved access to materials, access to appropriate finance, safe mode of transport.
- INPUTS (milestones & VfM tables): teacher training, distribution of textbooks, bursaries, van/bus, assessed for economy, efficiency and effectiveness.

And projects must be independently evaluated
The independent external evaluator:
- Helps to design and finalise tools
- Collects data on the learning indicator and information on attendance
- Conducts the household survey (HHS)
- Conducts qualitative research, including on output indicators if requested
- Produces data and a draft report
The project:
- Contracts the evaluation
- Helps design and finalise tools
- Feeds its theory of change into the evaluation
- Conducts its own qualitative research, including to inform logframe indicators, as well as its own attendance spot checks
- Quality controls the outputs from the evaluator

Part two: Challenges with the approach
- In-school vs. out-of-school approaches
- Enumerators and testing
- PbR: accountability vs. flexibility
- Cost and capacity

Finding the same girls is difficult, if out of school (OOS)…
GEC projects employ a longitudinal study, tracking a cohort of girls across three years (baseline, midline, endline), representative of their beneficiary population. Projects have tracked:
- The cohort via a household survey
- The cohort via schools
There are large statistical benefits to tracking a cohort: it reduces standard errors and allows other factors to be used in regressions.
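The standard-error benefit of tracking the same girls can be shown with a small sketch (invented scores): a paired comparison of within-girl gains removes stable between-girl differences that inflate the error when the rounds are treated as independent samples.

```python
import statistics

# Invented example: baseline and midline scores for the SAME six girls (a tracked cohort).
baseline = [30, 45, 38, 50, 42, 35]
midline  = [36, 52, 43, 58, 47, 40]
n = len(baseline)

# Standard error of the difference in means if the rounds were independent samples.
se_indep = (statistics.variance(baseline) / n + statistics.variance(midline) / n) ** 0.5

# Paired standard error: based on the variance of within-girl gains only.
gains = [m - b for b, m in zip(baseline, midline)]
se_paired = (statistics.variance(gains) / n) ** 0.5

print(se_paired < se_indep)  # True: tracking the cohort shrinks the standard error here
```

The gap is largest when girls differ a lot from each other but each girl's gain is similar, which is exactly the situation cohort tracking exploits.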

Tests are not perfect, particularly at higher levels…
Challenges have arisen over the course of the baseline. These include:
Complexity of enumeration and data collection:
- More training of enumerators at baseline would have been beneficial.
- EGRA: the use and reporting of the timed dimensions of the test.
- EGMA: the need to weight sections equally.
- ASER/UWEZO: scoring tests clearly, given a binary, level-based approach to assessment.
Ceiling and floor effects:
- Adaptations and re-testing have been needed where “ceiling” or “floor” effects were found.
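A ceiling or floor effect can be screened for with a simple share-at-the-bounds check. This is an illustrative sketch, not a GEC procedure; the function name, data and cut-offs are invented:

```python
def ceiling_floor_shares(scores, min_score, max_score):
    """Fraction of the sample scoring at the test's minimum and maximum."""
    n = len(scores)
    ceiling = sum(s >= max_score for s in scores) / n
    floor = sum(s <= min_score for s in scores) / n
    return ceiling, floor

# Invented midline scores on a 0-100 test.
scores = [0, 0, 12, 45, 100, 100, 100, 100, 98, 100]
ceiling, floor = ceiling_floor_shares(scores, 0, 100)
print(ceiling, floor)  # 0.5 0.2
```

A large share at the maximum means the test cannot register further gains for those girls, so measured 'additional' learning is biased downwards and a harder instrument (and re-testing) is needed.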

PbR challenges: accountability vs. flexibility…

Cost and capacity constraints for fieldwork…
To ensure complete, good-quality data, and that enumerators and data processors (e.g. data entry clerks and translators) are collecting and processing data correctly, fieldwork monitoring and data quality assurance should always be divided into three key components:
1. Pre-fieldwork processes
2. Fieldwork
3. Post-fieldwork processes
Investing in all three may save time and resources later. Many GEC project evaluators have lacked capacity at these stages.
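Post-fieldwork checks of the kind described can be sketched as simple record validation. Field names such as `girl_id` and the 0-100 score range are invented for illustration:

```python
# Hypothetical survey records; flag missing IDs, duplicates and out-of-range scores.
records = [
    {"girl_id": "G001", "score": 42},
    {"girl_id": "G002", "score": 130},  # out of range
    {"girl_id": "G001", "score": 42},   # duplicate
    {"girl_id": None,   "score": 55},   # missing ID
]

issues = []
seen = set()
for i, r in enumerate(records):
    if not r["girl_id"]:
        issues.append((i, "missing girl_id"))
    elif r["girl_id"] in seen:
        issues.append((i, "duplicate girl_id"))
    else:
        seen.add(r["girl_id"])
    if not 0 <= r["score"] <= 100:
        issues.append((i, "score out of range"))

print(issues)
```

Checks like these are cheap to run during fieldwork, when a record can still be corrected, rather than after the team has left the field.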

So, a host of challenges; this is not easy…
- Respondent fatigue
- Attendance data: accuracy, spot checks
- Organisational capacity: enumeration and data issues
- PbR-related risks: statistical significance
- Ethical concerns: control groups
- Fragile states: security, access
- Balance between evaluation and implementation

But also some achievements!
- Approximately 80,000 girls in the full GEC cohort
- An outcomes model and VfM methodology used across other DFID projects
- Quantitative and qualitative data contributing towards a detailed understanding of the context and education barriers in developing countries