Connections and issues in the European evaluation context Professor Murray Saunders President of the European Evaluation Society
Congratulations on the formation of the Slovak Society for Evaluation and welcome to the National Evaluation Societies and Networks in Europe (NESE)
- A word about what we mean by evaluation (a changing landscape)
- What the evaluation community in Europe identifies as important
- What is gained by sharing and working together
- Discussing the key issue of use and usability
My own background
- Began evaluation work in 1982 from a research background in educational change, policy and development
- Helped to found the UK Evaluation Society in 1992; President between 2001 and 2003
- Chaired the development group which formed the IOCE in 2003
- Current President of the European Evaluation Society, Director of the Centre for the Study of Education and Training (CSET), and Professor of Evaluation, Lancaster University
- Still defining what evaluation is…
Evaluative practice
For me (and this is unstable!!), an evaluative practice is routine, rule-governed behaviour prompted by an evaluative impulse, i.e. an impulse to attribute ‘value’ or ‘worth’, in some way, to a process, a programme, an object, a policy, a development or an intervention.
Evaluative practice
Evaluative practice concerns the purposeful gathering, analysis and discussion of evidence from relevant sources about the quality, worth and impact of provision, development, policy or practice. It is how we attribute value.
Evaluative practice also concerns things such as:
- Balancing diverse ethical interests
- Managing stakeholders
- Charting a way through ‘difficult’ or ‘inconvenient truths’
Ideas for a first work programme for NESE
Possible objectives:
- Exchanges of information
- Presentation of good practices
- Monitoring of evaluation activities and context
- Encouraging use and usability
Possible means:
- Page on EES website
- Permanent contacts
- Continued moderation by EES + a national society
- Meeting in Lisbon, October 2008
- Meeting in Munster, 8–9 October 2009
Monitoring evaluation in Europe:
- supply of evaluators
- education activities regarding evaluation
- institutional arrangements within the public sector
- activities by the supreme audit offices
- pluralism within each policy domain
- scope of evaluations
Overall strategy for professional development
- Filling gaps
- Complementing national training
- Contacts, conversations and exchanges of information
- Jointly sponsoring
- Thinking about capability, competence and standards
Picking up the issues of use and usability……..
What counts as use?
“Use refers to the extent to which the outputs of an evaluation are used as a resource for onward practice, policy or decision making”
What counts as usability?
“Usability refers to the way an evaluation design shapes the extent to which its outputs can be used”
How might ‘use’ be encouraged?
‘Use’ is enhanced by inclusivity:
- Authentication of focus and instrumentation
- Interest in outputs/findings
- Social capital building
- New knowledge as socially owned
- Increased chance of changes in practice
How do different types of evidence/data determine ‘use’ practices, e.g. narratives or statistics?
Political use of different types of evidence
‘Process use’
Process use refers to the unintended effects of carrying out an evaluation or ‘asking questions’:
- Foregrounding new issues
- Drawing attention to ‘hot spots’ or problem areas
- Forcing attention on difficult areas
- Providing a ‘voice’ for the powerless
- Drawing attention to time-lines
- Making participants think about ‘audience’ and ‘users’
- Policing role
Use as ‘engagement’ (a continuum from less to more engagement):
- Dissemination practice: report, executive summary, article
- Interactional practice: working alongside colleagues; analysis of situational enabling and constraining factors
- Presentational practice: seminars, presentations, active workshops
- Embodiments
Issues concerning “use”
- Embedded in decision-making cycles (clear knowledge of when decisions take place and who makes them)
- Clear understanding of organisational memory (how evaluations might accumulate)
- Capacity of an organisation to respond
- Systemic processes (feeding into structures that are able to identify and act on implications)
- Organisations that are lightly bureaucratised (complex adaptive systems) are better placed to respond to ‘tricky’ or awkward evaluations
- Evaluations that are strongly connected to power structures (what does this mean?)
- Evaluations that are congruent: suggestions based on evaluation need to build on what is already in place
Designing evaluations for usability: some critical questions
- Reasons and purposes [planning, managing, learning, developing, accountability]
- Uses [providing and learning from examples of good practice, staff development, strategic planning, PR, provision of data for management control]
- Foci [activities, aspects, emphasis to be evaluated; should connect to the priority areas for evaluation]
- Data and evidence [numerical, qualitative, observational, case accounts]
- Audience [community of practice, commissioners, yourselves]
- Timing [coincidence with decision-making cycles, life cycle of projects]
- Agency [yourselves, external evaluators, combination]
Key issues going forward?
In last year’s survey (15 societies and networks in Europe), these were some of the issues identified by the evaluation community in Europe:
- Evaluation use and usability
- Raising politicians’ awareness of evaluation
- Promoting research on evaluation
- Promoting and defining standards and good practice
- Supporting evaluation capacity builders in the public service
- Promoting evaluation training
- Setting up evaluation societies