Phase 2 Evaluation of the Paris Declaration (PD): Evaluation Methodology. Reference Group Workshop/Meeting, 11–13 February 2009.

Methodological Issues The broad choice of evaluation methodology, as we discussed yesterday, follows on from the evaluation questions chosen. However, there are more specific methods-related issues that need to be thought through at this stage.

Methodological Issues One way of focussing on methodology is to consider what methodological ‘standards’ or ‘qualities’ we expect this evaluation to have. (You will notice a certain overlap with the previous workshop on ‘quality standards’!)

Methodological Issues For example, it can be argued that this evaluation should end up with the following qualities:
- A balanced & sufficient sample of countries
- Sufficient coverage of sectors and themes
- Information of good quality
- Offering the possibility of explanation and attribution

Methodological Issues If an evaluation has these qualities, it will allow us to:
- generalise to some extent across different PD settings (‘external validity’);
- be confident that our measurements or descriptions are consistent (‘reliability’); and
- have confidence in the strength or ‘power’ of findings – that they are sufficiently supported by the evidence collected and analysed.

Methodological Issues Making the right decisions about methodology at the beginning will make an evaluation defensible: able to withstand criticism when it eventually reports. Let’s take each of these ‘attributes’ in turn.

Methodological Issues ‘A balanced & sufficient sample of countries’ When policy makers ask about the PD, they want to know whether we can say something about countries in different geographical regions; about those that are more and less aid dependent; and about both those with strong institutions and governance and those with some elements of ‘fragility’, perhaps because they are still recovering from wars or conflicts. We also need enough countries to be able to support conclusions for the PD as a whole.
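
As a purely illustrative sketch (not part of the original slides), a balanced sample of this kind could be drawn by stratifying on the dimensions just listed; the country entries, strata and sample size below are all assumptions, not a proposed Phase 2 sample.

```python
import random

# Hypothetical country pool, stratified on the dimensions named above:
# region, aid dependence, and elements of fragility. Placeholders only.
countries = [
    {"name": "Country A", "region": "Africa",   "aid_dependent": True,  "fragile": False},
    {"name": "Country B", "region": "Africa",   "aid_dependent": True,  "fragile": True},
    {"name": "Country C", "region": "Asia",     "aid_dependent": False, "fragile": False},
    {"name": "Country D", "region": "Asia",     "aid_dependent": True,  "fragile": False},
    {"name": "Country E", "region": "Americas", "aid_dependent": False, "fragile": False},
    {"name": "Country F", "region": "Americas", "aid_dependent": True,  "fragile": True},
]

def stratified_sample(pool, per_stratum=1, seed=2009):
    """Draw up to per_stratum countries from each (region, aid, fragility) cell."""
    rng = random.Random(seed)
    strata = {}
    for country in pool:
        key = (country["region"], country["aid_dependent"], country["fragile"])
        strata.setdefault(key, []).append(country)
    sample = []
    for members in strata.values():
        sample.extend(rng.sample(members, min(per_stratum, len(members))))
    return sample

for c in stratified_sample(countries):
    print(c["name"])
```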

Methodological Issues ‘Sufficient coverage of sectors and themes’ The countries selected need to cover the main policy areas and sectors recognised as important for development – for example healthcare, encouraging small businesses, education, progress towards the MDGs, and international trade support. They also need to include important themes such as capacity development, civil society participation, donor harmonisation, and improving governance & reducing fragility. This will allow sensible comparisons to be made across cases; a simple coverage check is sketched below.
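
To make the coverage requirement concrete, here is a small illustrative check (the theme labels and country assignments are invented for the sketch) that a candidate set of case-study countries jointly covers every sector and theme of interest.

```python
# Sectors/themes the evaluation wants covered, per the slide above.
REQUIRED = {"health", "small business", "education", "MDG progress", "trade",
            "capacity development", "civil society", "donor harmonisation",
            "governance & fragility"}

# Hypothetical mapping of candidate case-study countries to what each covers.
candidate_cases = {
    "Country A": {"health", "education", "governance & fragility"},
    "Country B": {"trade", "donor harmonisation", "MDG progress"},
    "Country C": {"capacity development", "civil society", "small business"},
}

covered = set().union(*candidate_cases.values())
missing = REQUIRED - covered
print("full coverage" if not missing else f"not yet covered: {sorted(missing)}")
```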

Methodological Issues ‘Information of good quality’ This requires:
- Available information – one rationale for selecting some sectors/themes
- Willingness to use innovative data sources
- Ensuring that all pre-existing sources are ‘synthesised’, reviewed and exploited
- Cross-checking (‘triangulating’) across multiple sources of information (see the sketch after this list)
- Spending relatively more time on data collection
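
As an illustration of the ‘triangulating’ step (the sources and figures below are invented), one might compare the same indicator across sources and flag disagreement beyond a tolerance for follow-up:

```python
def triangulate(readings, rel_tolerance=0.10):
    """readings: source name -> reported value for one indicator.
    Returns (consistent, relative spread across sources)."""
    values = list(readings.values())
    lo, hi = min(values), max(values)
    spread = (hi - lo) / hi if hi else 0.0
    return spread <= rel_tolerance, spread

# Hypothetical figures for one indicator, e.g. 'aid disbursed through
# country systems (%)', as reported by three different sources.
sources = {"government records": 61.0, "donor reports": 58.5, "survey estimate": 47.0}
consistent, spread = triangulate(sources)
print(f"consistent={consistent}, relative spread={spread:.1%}")
# Here the ~23% spread would flag this indicator for follow-up.
```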

Methodological Issues ‘Offers the possibility of explanation & attribution’ There are two ‘classic’ ways we can attempt to explain. First, through longitudinal analyses that follow a causal chain over time – this is the basis of time-series data and panel studies, as well as causal modelling, tracker studies and ‘theory-based evaluations’. Second, we can compare across places, settings or time periods – including before-and-after studies, comparison groups, quasi-experiments and full controlled experiments.
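
A minimal sketch of the second, comparative logic (all numbers invented): a before/after change in the setting exposed to the intervention is netted against the change in a comparison group, so that a general trend is not attributed to the intervention itself.

```python
def difference_in_differences(treated_before, treated_after,
                              comparison_before, comparison_after):
    """Change in the treated group net of the change in the comparison group."""
    return (treated_after - treated_before) - (comparison_after - comparison_before)

# Hypothetical outcome, e.g. 'use of country PFM systems (%)'.
effect = difference_in_differences(treated_before=40.0, treated_after=55.0,
                                   comparison_before=42.0, comparison_after=48.0)
print(f"estimated effect: {effect:+.1f} percentage points")  # +9.0
```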

Methodological Issues On this basis, we can put together a possible starting list of methods that could be used in Phase 2. The list would include:

Methodological Issues
- Synthesis reviews of existing evaluations, research and indicator systems
- Comparative in-depth case studies (of country partnerships), chosen to contain a good cross-section of common themes/sectors
- Longitudinal studies – either forward looking (‘theory-based’ mapping of plausible directions of travel) or backward looking, tracking back to PD-like, longer-established policies
- Targeted comparative studies to ‘supplement’ country-based case comparisons

Methodological Issues Methodologies have to be understood as more than analytic tools. How they are resourced and ‘steered’ will determine their value as much as their technical sophistication. To take two examples:

Methodological Issues Putting together the best team of experts to undertake the evaluation at country and central levels will be challenging. It may require bringing together public sector and civil society expertise, and national and possibly regional resources and skills.

Methodological Issues National ‘reference groups’ will need to open up access to information and cooperation, institutionally and across government; to safeguard the independence and credibility of the evaluation; and to build bridges so as to make it more likely that evaluation outputs will be used and useful.

Methodological Issues These are some of the issues that need more discussion before the Terms of Reference for this evaluation are prepared – and they will therefore be taken up in the group discussion session that follows, after any points of clarification.