Good Evaluation Planning – and why this matters
Presentation by Elliot Stern to Evaluation Network Meeting, January 16th 2015

Good Evaluation Planning
When we look at evaluations of Structural Fund supported interventions, we find many that are good and many that are very poor. This can often be traced back to the quality of evaluation planning: a clearly thought-out and well-documented planning process makes a difference.

Good Evaluation Planning
‘Good’ evaluations are those that are:
- Useful and usable – relevant and understandable
- Technically and methodologically appropriate
- Suited to the programme concerned – differentiating tourism, infrastructure or enterprise-related programmes
- Well communicated to potential users – managing authorities and stakeholders
‘Poor’ evaluations lack these qualities.

Good Evaluation Planning
Commission Guidance indicates that an Evaluation Plan should include:
- Objectives, coverage and coordination – and limiting/focusing interventions, ensuring evaluability
- Specification of evaluation responsibilities, processes and partner involvement
- Availability of data and data sets
- Expertise and provisions for evaluation independence
- Use and communication of evaluation
- Quality management strategy
- Focus and rationale for the evaluation concerned
- Timetable and budgets
(A simple way to track these elements against a draft plan is sketched below.)
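
A minimal sketch in Python for tracking a draft plan against this checklist; the element names are informal shorthand for the bullets above, not an official Commission schema:

    # Track whether a draft Evaluation Plan covers each element the
    # Commission Guidance asks for. Element names are informal shorthand.
    REQUIRED_ELEMENTS = [
        "objectives, coverage and coordination",
        "responsibilities, processes and partner involvement",
        "availability of data and data sets",
        "expertise and evaluation independence",
        "use and communication of evaluation",
        "quality management strategy",
        "focus and rationale",
        "timetable and budgets",
    ]

    def missing_elements(plan: dict) -> list:
        """Return the required elements the draft plan has not yet addressed."""
        return [e for e in REQUIRED_ELEMENTS if not plan.get(e)]

    draft = {
        "availability of data and data sets": "monitoring data from 2014 onwards",
        "timetable and budgets": "annual rolling plan, budget per priority axis",
    }
    print(missing_elements(draft))  # the six elements still to be drafted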

Good Evaluation Planning
An evaluation plan is not free-standing. It is embedded in the programme cycle, reflecting:
- National and regional strategies and priorities
- European strategies and priorities – ‘smart, sustainable and inclusive growth’
- Making comparisons and aggregation possible
It provides a basis for:
- Assessing and addressing data needs
- Drawing up evaluation Terms of Reference
- Selecting evaluation contractors
- Judging evaluation quality
- Translating evaluation outputs into recommendations and practical actions

Good Evaluation Planning
Putting together an Evaluation Plan – who needs to be involved?
- Managing authorities and member states, who need assistance to make judgements about evaluation quality
- Evaluation managers and commissioners, who need tools to improve the quality of ongoing evaluations
- Policy makers, who will need to make judgements about the robustness and credibility of evaluation evidence when using evaluation findings to develop future policies
- Evaluation practitioners, who need clear statements of evaluation quality expectations and standards in order to improve their own practice

Good Evaluation Planning
The 2014–2020 programming period emphasises results and impacts. This is not easy!
- Identifying impacts – construct validity – hence the importance of consulting and involving beneficiaries
- Disentangling multiple causes and effects
- Distinguishing which results can be attributed to SF interventions and which to other influences
- Recognising contextual influences and the scope for generalisation
- Matching the time trajectories of evaluations and programme results, which differ (e.g. enterprise support versus infrastructure development) and are often ‘emergent’

Good Evaluation Planning
Commission Guidance identifies two families of evaluation approach for impact evaluations:
- Theory-based evaluations, which focus on how and why programmes have an effect – they depend on opening up the ‘black box’ and identifying causal mechanisms (intervention logics) and contextual factors
- Counterfactual evaluations, which compare what happened with what would have happened without the intervention, using control groups and statistical analysis (a minimal sketch follows below)
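
The sketch below makes the counterfactual family concrete on synthetic data (all numbers are invented; a real evaluation would need a carefully constructed comparison group rather than this idealised random one). The impact estimate is the treated-versus-control difference in mean outcomes, with a t-test for whether it could plausibly be zero:

    # Minimal counterfactual sketch on synthetic data (all numbers invented).
    # The impact estimate is the treated-vs-control difference in mean outcomes.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    control = rng.normal(loc=100.0, scale=15.0, size=200)  # units without support
    treated = rng.normal(loc=108.0, scale=15.0, size=200)  # units with support

    effect = treated.mean() - control.mean()
    t_stat, p_value = stats.ttest_ind(treated, control)
    print(f"estimated impact: {effect:.1f} (p = {p_value:.4f})")

In practice the comparison group rarely comes from random assignment; quasi-experimental devices such as matching or regression discontinuity are used to approximate it.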

Good Evaluation Planning
Operationally, the planning of impact evaluations usually requires combining designs and methods:
- Qualitative and quantitative
- Different designs – statistical, experimental, theory-based, participatory
The choice of designs and methods depends on three considerations:
- Evaluation questions
- Characteristics of programmes
- Capability of designs and methods

Good Evaluation Planning
This can be summed up in a ‘design triangle’:
[diagram: evaluation questions, characteristics of programmes, and capability of designs and methods at the three corners]

Good Evaluation Planning
Typically, in results-oriented or impact evaluations we can ask different kinds of evaluation questions:
- To what extent can a specific impact be attributed to the intervention? – a counterfactual question
- Did the intervention make a difference? – a contribution question
- How much of a difference did the intervention make? – a statistical question
- How has the intervention made a difference? – an explanatory question
- Will the intervention work elsewhere? – a generalisability question
(A rough mapping from question type to design family is sketched below.)
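
The mapping follows the slide’s own labels; it is an illustrative simplification, and the ‘design triangle’ shows why it cannot be applied mechanically:

    # Indicative pairing of evaluation question types with design families,
    # following the slide's own labels; illustrative, not a rule.
    QUESTION_TO_DESIGN = {
        "attribution (to what extent?)": "counterfactual - control groups, statistics",
        "contribution (did it matter?)": "theory-based - contribution analysis",
        "magnitude (how much?)": "statistical - surveys, econometrics",
        "explanation (how? why?)": "theory-based - intervention logic, realist",
        "generalisability (elsewhere?)": "comparative synthesis across contexts",
    }

    for question, design in QUESTION_TO_DESIGN.items():
        print(f"{question:<32} -> {design}")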

Good Evaluation Planning
But what the ‘design triangle’ suggests is that some questions are appropriate in some circumstances but not others. For example:
- If a programme is implemented differently in different settings, asking ‘did it work?’ is too simple – we need to ask how it worked in different contexts, and why
- If a programme involves very few cases, as with support for large enterprises, statistical analysis may not be possible (see the power calculation sketched below)
- Where there is much prior policy experience, and even pre-existing theory, it is easier to test programme theory – otherwise an evaluation also has to develop its own theory, through an iterative design and by tracking the implementation process
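
A power calculation makes the ‘very few cases’ point quantitative: it shows how large a standardised effect would need to be before a small sample could reliably detect it. The sketch uses statsmodels, and the sample size of 12 per group is invented for illustration:

    # Why statistical analysis may be ruled out with very few cases:
    # solve for the minimum detectable standardised effect size (Cohen's d)
    # with 12 units per group at conventional significance and power.
    from statsmodels.stats.power import TTestIndPower

    mde = TTestIndPower().solve_power(nobs1=12, alpha=0.05, power=0.8)
    print(f"minimum detectable effect with n=12 per group: d = {mde:.2f}")
    # Prints roughly d = 1.2 -- far larger than most programme effects,
    # so a statistical design would be uninformative here.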

Good Evaluation Planning
Overall, research has shown that programme interventions seldom ‘work’ in isolation. They usually interact with other programmes and policies, particular local or institutional resources, historical ‘assets’, and cultural and attitudinal predispositions. Hence the importance of newer methodologies and designs such as theory-based, realist and contribution analysis.
At the very least this makes methodological choices more demanding, and it poses the challenge of finding evaluation skills that are often in scarce supply. Evaluation capacity development and skills recruitment are therefore an important aspect of evaluation planning.

Good Evaluation Planning
In summary:
- Good evaluations require good evaluation plans
- A good evaluation plan requires a planning process that involves users and stakeholders in a partnership relationship
- The characteristics and ambition of SF interventions, and the focus on ‘results’ and ‘impacts’, have made evaluation designs more methodologically challenging
- This requires skills, evaluation capacity development, data availability, and careful design of programmes so that they are evaluable
- Careful evaluation planning ensures evaluation quality, use, and policy-relevant knowledge accumulation – which is why it is important to devote time and effort to the evaluation planning process