1 New frontiers for evaluation: challenges to evaluation practice and knowledge base. Philippe Larédo, ENPC and University of Manchester. International Conference on IoIR

2 Starting point: the comeback of evaluation
Main source mobilised: the recent review conducted for the OECD with Luke Georghiou
Two main "surprises" (compared to the 1990s):
- growing trust by policymakers and a stronger articulation with the decision-making process
- a changing set of issues (institutional transformation, changing understanding of the challenges faced)
... which question evaluation practices and require reconsidering part of the accumulated knowledge base on evaluation

3 Three main messages
Rethinking delivery: "being taken seriously" drives new requirements on evaluation products and delivery processes
Changing foci require adapting processes (an issue as much for policymakers as for research analysts)
For research in evaluation methods:
- less an issue of refining existing tools
- than of reconsidering approaches to existing questions
- and of new designs for changing understandings

4 The presentation
Focus on changing foci:
1. System review and institutional "renewal"
2. Excellence, frontier science and the evaluation of "institutes"
Analysing for each: delivery, process and methodological issues.

5 I- NSI & institutional transformation: the "old" model revisited
1. Political shaping of the institutional framework: revisiting the external (OECD reviews) / internal (advisory bodies) divide:
- reporting mechanisms
- new forms of external reviews
2. 'Research' evaluation: from the "performance & relevance" of individual instruments (mainly programmes) to:
- relevance of present funding arrangements
- portfolios of programmes

6 System reviews
National policy evaluations (OECD without OECD, e.g. Finland)
- Based on nationally selected international peers --> 'authoritative' reports (credibility for political staff)
- Process and methods secondary
- Main issue = "pragmatism" in recommendations (what is politically acceptable and feasible in terms of implementation)
Monitoring systems (e.g. GPRA and PART)
- De facto, mostly new "reporting" systems (reshaping the format of annual reports for institutions) --> one danger (e.g. Australian university monitoring): forgetting institutions!
- Quantitative data = the drive to "benchmark" and/or "rank" institutions --> one danger: caught between the 'banal' (what is available) and the 'ad hoc' (no possible comparisons) --> thus a central issue for evaluation research = 'positioning' indicators

7 Reviewing funding arrangements (1)
Complex evaluations of key funding structures (e.g. RCN, FFF and FWF)
Main characteristics: professional, international consortia; multiple entry points; de facto encompassing (whatever the terms of reference); benchmarking often central; new products (see above)
Methodological issues: most evaluations conclude on the professionalisation of the evaluated body and of the corresponding ministry. But what are the 'relevant standards'?
Process: consortia are selected through very complex and detailed calls, while it is very difficult to anticipate the hierarchy of aspects (especially the points to deepen) --> one critical issue = the administrative shaping of evaluations

8 Reviewing funding arrangements (2): delivery issues
The changing landscape of 'decision-making':
- no longer advice to administrative decision making
- more and more feeding into a public debate (which started before the evaluation and will go on after it: see Austria)
Impacts on publications:
- two steps: evaluation files & evaluation synthesis
- the need to delineate targeted audiences and the issue of writing the synthesis adequately
Impact on the delivery process: not a one-off product but repeated interactions with 'stakeholders'.

9 Evaluating portfolios of programmes
The panel-based model (e.g. Finnish Academy of Sciences, FP)
Characteristics:
- not a meta-evaluation
- tackling broader issues (composition, coherence and relevance of the portfolio, relevance of implementation structures...)
Process problems:
- ongoing practice: at best, preliminary 'characterisation' studies (recipients, effects, evaluations of individual programmes) + the usual panel meetings
- problem: the need for 'professional studies' of the transversal problems identified
- major issue: organising a two-step, panel-based evaluation process
Delivery issue: reports are mostly 'boring', with the usual long list of recommendations. How to frame synthesis and interaction?

10 Programme portfolios: methodological issues
Still problems at the programme level:
- relationship between programme aims and evaluation criteria: where was the 'problem-solving dimension' in the FP5 evaluation? How to cope with "societal effects" or "social benefits"?
- relationship between the effects identified and their interpretation: e.g. discussing the skewed distribution of effects
Major work needed at the portfolio level:
- analysing the composition of the portfolio
- assessing the relevance and performance of implementation structures: which references, "benchmarks"...
- benchmarking: the need for a 'clearing house'?

11 II- Evaluation & capability building
A fast-growing focus for policy and evaluation:
- shaping and core funding of institutes by institutions, e.g. the Helmholtz Association, CSIC, INSERM...
- multiplication of programmes for "centres of excellence", "competence centres"... (see the overview by Technopolis)
- rapid deployment of national (and regional) systems for evaluating university research (following the UK RAE)
Lines of change: periodic and articulated with the institution's strategic programming; introducing competitive processes; based on an international peer model reviewing quality ("excellence"); direct connection with funding

12 Evaluation & 'institutes': process issues
Process: the "international peer-based model" (even with delegation to outside bodies, e.g. EMBO)
- How to cope with aspects other than 'academic quality'?
- Path and organisational dependency: how can the model internalise these?
- Critical 'ex-ante' shaping by required formats (often highly specified)
Delivery: articulation between evaluation & funding
- Often blurred mechanisms within institutions
- The specific case of university research: forgetting "universities as institutions" (a key incoherence of most, if not all, existing mechanisms)

13 Capacity building: methodological issues
New keywords: excellence, fragmentation, attractiveness, frontier science...
"Picturing" the landscape:
- the role of mapping approaches and 'positioning' tools and indicators (e.g. the Shanghai ranking)
- handling the normative dimension: is more, or higher, always better?
Measuring transformation: changing relations between action, outputs, outcomes and effects
- time needed for new (human) capability building
- markers of output = new articles, patents, diplomas
- outcome = capacity mobilised (directly via contracts or indirectly via mobility)
- effect = performance or result of mobilisation (with all the well-known problems of attribution)
Assessing policy: what relation between a given policy support and the construction of new capabilities? (another type of "project fallacy"?)

14 Some conclusions
We are still in the infancy of discussing the articulation of evaluation and decision making --> impacts on evaluation products, interaction with audiences and the 'administrative shaping of evaluations'
Positioning problems and actors' capacity and strategy are major issues --> shifting from input-output indicators to "positioning" indicators
The growing focus on capabilities calls for important methodological developments
Issues of institutional relevance entail new "two-step" (or even multi-step) processes.

