Capturing Effects of Interventions, Policies and Programmes Murray Saunders President EES CSET Lancaster University Sixth European Conference on Evaluation of Cohesion Policy Warsaw, 30 November-1 December 2009 New Methods for Cohesion Policy Evaluation: Promoting Accountability and Learning
Brief background on experience and perspective
- Involved in evaluations for 25 years: EES, UKES and the IOCE
- Mainly in the fields of evaluating changes and effects of policy (learning, work, ICTs) and change processes in development environments (higher education, rural development, sustainable change)
- Believe in inclusive evaluations: methodologically and aspirationally
EES position on impact and effects
- The EES argues for a multiple-method approach to evaluating effects.
- The evaluation literature documents that all methods and approaches have strengths and limitations, and that a wide range of scientific, evidence-based, rigorous approaches to evaluation have been used in varying contexts for establishing effects.
- Evaluating effects is complex, particularly in multi-dimensional interventions, which require a variety of methods that take into account, rather than obscure, inherent complexity. Unpacking or "chunking" interventions becomes important in this respect.
- Evaluation standards and principles from across Europe, as well as from other parts of the world, are pluralistic in nature: they do not favour any specific approach or group of approaches.
A note on the evaluation map
- Policies (cohesion, integration and reducing disparity in social and economic development across member states: the logic of policy intention)
- Instruments (structural funds resourcing interventions that determine growth: the theory in action of funded and targeted development)
- Mechanisms (specific programmes, interventions and projects, e.g. in transport, human resources, public management: the theory of change embedded in specific programmes)
- Effects (positive changes in behaviour (economic, social, educational, health) brought about by the aggregated determination of mechanisms, instruments and policies)
Why a re-emphasis on effects?
1. The urge to sense-make in complex environments (evaluations for knowledge)
2. Social and political imperatives (evaluation and social capital building: issues of transparency, resources, legitimacy and equity)
3. Methodological debate: difficulties and uncertainties in addressing end points (evaluation methodologies: attribution, causality, alignment and design, i.e. the basis on which we can say that it is working)
The concern with establishing effects is in line with international interest in impact and effect evaluations: DG REGIO is providing leadership in this debate within Europe.
The trajectory of focus in structural and cohesion fund evaluation
An emerging consensus on the need to shift the evaluation focus along the change trajectory implied by structural fund investment:
- Management of spending (level 1)
- Focus on outputs (level 2)
- Focus on sustainable changes in behaviours of target groups (levels 3 and 4)
- Long-term aims of cohesion and integration (level 5)
Towards a holistic approach to indicators of effect?
- Enabling indicators (new policies, new people, new spaces, new buildings, new roads: focus on necessary conditions for change)
- Process indicators (new and changed uses, new attitudes, new cultures: focus on change processes)
- Effect indicators (focus on emergent recurrent behaviours, which produce new resources and multiplier effects)
Changes in practice (recurrent behaviour) as the focus for capturing effects in an evaluation
We look for sustainable changes in clusters of practices in order to establish effects: changes in sustainable practices (recurrent behaviours), i.e. community practices, educational practices, commercial and economic practices, governance practices, health practices, etc.
Sustainable practice clusters are brought about by:
- New protocols and systems
- New infrastructural entities (transport links, buildings, etc.)
- New opportunities and networks
- New artefacts and tools
Why effects and the clarity of focus?
Shifting from volume and descriptions to analysis of effects: "have a go" in an imperfect world. The direction of European evaluation practice is progressive focusing:
- Level 1: Propriety: protocols and due and proper process (was money spent properly, were plans adhered to, timelines addressed, consultations held, needs addressed, environments scanned?)
- Level 2: Quality of the outputs: roads, buildings, infrastructural development (fitness for purpose)
- Level 3: Use of the outputs (increasing focus on how new infrastructure is used, how it is adapted and modified)
- Level 4: Emergence of new practices enabled by outputs in social and economic domains
- Level 5: Impact on macro or long-term strategic objectives of cohesion and integration (aggregated and differentiated long-term effects)
Issues in this re-emphasis on effects
There is the chimera of certainty, and a difficulty in establishing clear lines of determination. What do we do about the problem of attribution and causality?
- Strong experimental designs require integration at the inception of interventions
- Large-scale, complex developments present both ethical and practical difficulties for experimental designs
- Experimentation is good at showing what is working, but much weaker on diagnostics (how and why)
- However, I would argue that there is space for RCTs and counterfactuals as part of an overall approach in specified subsets of activity
So what, in an imperfect world?
Issues in this new focus on effects
Is an alternative paradigm viable? In uncertainty, we need evaluation to provide provisional stability. We create provisional stability by working with new metaphors:
- Courtrooms, not laboratories (using plausible inference)
- Indicative and evocative rather than definitive, one-dimensional causality
- Alignment rather than attribution
Indicators of effects
There are confusions in the use of indicators:
- Mode 1: Indicators interpreted as areas, activities, domains or phenomena on which evidence will be collected (open, designed prospectively). Experimentation included here?
- Mode 2: Indicators interpreted as the evidence itself (identified retrospectively)
- Mode 3: Indicators as a pre-defined or prescribed state to be achieved or attained; in this way indicators constitute desired outcomes or effects (closed and designed prospectively)
Issues in this new focus on effects
What counts as evidence?
- Aim for a broad church (narratives, vignettes, depictions, modelling and statistical analysis of policy recipients' experience: what are the behaviours that lead to increases in per capita GDP, for example?)
- Openness, but still rigorous (critical importance of strong designs)
- A multi-method approach to comparison or the counterfactual: between stake-holding groups over time; between matched groups of programme recipients (quasi-experimental); between what is and what might have been
- Use of evaluation output: organisational capacity to respond to the analysis of effects
- Usability of evaluation output: accessibility, clear messages for policy and mechanisms
Challenges for the evaluation of effects
- Have an open and inquisitively sceptical view on method; multi-method is appropriate (balancing designs that require high levels of design predictability with those that are more open, adaptive and use underlying theories of change as starting points)
- Develop what has been termed a balanced scorecard
- Acknowledge and work with the grain of complex, uncertain and multi-layered causality: moving through the levels of focus
- Adopt the truth that the evaluable, i.e. what can be evaluated, is imperfect, unpredictable and often unanticipated
- Understand evaluation as a tool for creating provisional stabilities
- Consider the argument that outputs take diverse forms but that effects are always reducible to changed practices (behaviours)
Evaluation in the public interest: participation, politics and policy
The 9th European Evaluation Society International Conference, Prague, 6-8 October 2010