Good Evaluation Planning – and why this matters. Presentation by Elliot Stern to the Evaluation Network Meeting, January 16th 2015.


1 Good Evaluation Planning – and why this matters
Presentation by Elliot Stern to the Evaluation Network Meeting, January 16th 2015

2 Good Evaluation Planning
When we look at evaluations of Structural Fund-supported interventions, we find many that are good and many that are very poor. This can often be traced back to the quality of evaluation planning – a clearly thought-out and well-documented planning process makes a difference.

3 Good Evaluation Planning
‘Good’ evaluations are those that are:
- Useful and usable – relevant and understandable
- Technically and methodologically appropriate
- Suited to the programme concerned – differentiating tourism, infrastructure and enterprise-related programmes
- Well communicated to potential users – managing authorities and stakeholders
‘Poor’ evaluations lack these qualities.

4 Good Evaluation Planning
Commission Guidance indicates that an Evaluation Plan should include:
- Objectives, coverage and coordination – limiting/focusing interventions and ensuring evaluability
- Specification of evaluation responsibilities, processes and partner involvement
- Availability of data and data sets
- Expertise and provisions for evaluation independence
- Use and communication of evaluation
- Quality management strategy
- Focus and rationale for the evaluation concerned
- Timetable and budgets

5 Good Evaluation Planning
An evaluation plan is not free-standing. It is embedded in the programme cycle, reflecting:
- National and regional strategies and priorities
- European strategies and priorities – ‘smart, sustainable and inclusive growth’
- Making comparisons and aggregation possible
It provides a basis for:
- Assessing and addressing data needs
- Drawing up evaluation Terms of Reference
- Selecting evaluation contractors
- Judging evaluation quality
- Translating evaluation outputs into recommendations and practical actions

6

7 Good Evaluation Planning
Putting together an Evaluation Plan – who needs to be involved?
- Managing authorities and member states, who need assistance to make judgements about evaluation quality
- Evaluation managers and commissioners, who need tools to improve the quality of ongoing evaluations
- Policy makers, who will need to make judgements about the robustness and credibility of evaluation evidence when using evaluation findings to develop future policies
- Evaluation practitioners, who need clear statements of evaluation quality expectations and standards in order to improve their own practice

8 Good Evaluation Planning
The programming period emphasises Results and Impacts. This is not easy! It requires:
- Identifying impacts – construct validity – hence the importance of consulting and involving beneficiaries
- Disentangling multiple causes and effects
- Distinguishing which results can be attributed to SF interventions rather than to other influences
- Recognising contextual influences and the scope for generalisation
- Matching the time trajectory of evaluations to programme results, which differ (e.g. enterprise support versus infrastructure development) and are often ‘emergent’

9 Good Evaluation Planning
Commission Guidance identifies two families of evaluation approach for Impact Evaluations:
- Theory-based evaluations, which focus on how and why programmes have an effect – these depend on opening up the ‘black box’ and identifying causal mechanisms (intervention logics) and contextual factors
- Counterfactually based evaluations, which compare what happened with what would have happened without the intervention, using control groups and statistical analysis
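The counterfactual logic above can be illustrated with a minimal sketch. The data and the difference-in-means estimator here are purely hypothetical illustrations (not from the presentation); in practice the control group must be carefully matched and analysed with appropriate statistical methods:

```python
# Illustrative sketch only: hypothetical outcome data for firms that
# received Structural Fund support (treated) versus comparable firms
# that did not (control group).
treated = [12.0, 15.0, 11.0, 14.0, 13.0]   # e.g. jobs created per supported firm
control = [9.0, 10.0, 8.0, 11.0, 10.0]     # e.g. jobs created per unsupported firm

def mean(values):
    """Arithmetic mean of a list of numbers."""
    return sum(values) / len(values)

# The counterfactual estimate of impact is the difference between what
# happened (treated mean) and what would have happened (control mean).
estimated_impact = mean(treated) - mean(control)
print(round(estimated_impact, 1))  # prints 3.4
```

The estimate is only credible if the control group is a valid stand-in for the counterfactual, which is why the planning of data collection and group selection belongs in the evaluation plan.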

10 Good Evaluation Planning
Operationally, the planning of Impact Evaluations usually requires combining designs and methods:
- Qualitative and quantitative
- Different designs – statistical, experimental, theory-based, participatory
The choice of designs and methods depends on three considerations:
- Evaluation questions
- Characteristics of programmes
- Capability of designs and methods

11 Good Evaluation Planning This can be summed up in a ‘design triangle’:

12 Good Evaluation Planning
Typically, in results-oriented or impact evaluations we can ask different kinds of Evaluation Questions:
- To what extent can a specific impact be attributed to the intervention? – a counterfactual question
- Did the intervention make a difference? – a contribution question
- How much of a difference did the intervention make? – a statistical question
- How has the intervention made a difference? – an explanatory question
- Will the intervention work elsewhere? – a generalisability question

13 Good Evaluation Planning
But what the ‘design triangle’ suggests is that some questions are appropriate in some circumstances but not in others. For example:
- If a programme is implemented differently in different settings, asking a question like ‘did it work?’ is too simple – we need to ask how it worked in different contexts, and why
- If a programme involves very few cases, as with support for large enterprises, then statistical analysis may not be possible
- When there is much prior policy experience, and even pre-existing theory, it is easier to test programme theory – otherwise an evaluation also has to develop its own theory through an iterative design and by tracking the implementation process

14 Good Evaluation Planning

15 Good Evaluation Planning
Overall, research has shown that programme interventions seldom ‘work’ in isolation. They usually interact with other programmes and policies, particular local or institutional resources, historical ‘assets’, and cultural and attitudinal predispositions. Hence the importance of newer methodologies and designs such as theory-based, realist and contribution analysis.
At the very least this makes methodological choices more demanding and poses challenges in identifying evaluation skills that are often in scarce supply. Evaluation capacity development and skill recruitment is therefore an important aspect of evaluation planning.


17 Good Evaluation Planning
In summary:
- Good evaluations require good evaluation plans
- A good evaluation plan requires a planning process that involves users and stakeholders in a partnership relationship
- The characteristics and ambition of SF interventions, and the focus on ‘results’ and ‘impacts’, have made evaluation designs more methodologically challenging
- This requires skills, evaluation capacity development, data availability and the careful design of programmes so that they are evaluable
- Careful evaluation planning ensures evaluation quality, use and policy-relevant knowledge accumulation – which is why it is important to devote time and effort to the evaluation planning process

