Introduction
- Extensive experience of ex-post evaluation of national support programmes for innovation; less experience at the regional level;
- The paper aims to identify lessons from evaluation at the national level for innovation support at the regional level;
- It starts from a discussion of some key features of innovation;
- This is followed by a description of innovation policy instruments;
- Next there is a discussion of an integrated approach to ex-ante appraisal, monitoring and ex-post evaluation;
- Some suggestions are then offered about how evaluation should be undertaken;
- Some additional issues relevant to evaluation of innovation support are then discussed;
- The paper finishes with a short list of main conclusions.
30 November 2009, Lessons from Evaluation of National Innovation Support Programmes, John Barber

Key Features of Innovation
- The process by which firms master product designs & production processes that are new to them, if not to the world, nation or sector;
- Includes the introduction of novel products & processes but also their subsequent diffusion throughout the economy & society;
- Innovation involves a combination of one or more of: new technology, new business practices, new markets, organisational change and upgrading of work-force knowledge and skills;
- Technology consists of knowledge, artefacts, software, skills, designs & prototypes & routines and ways of doing things;
- Firms obtain technology and knowledge from a wide range of sources; the mix of sources varies systematically across sectors;
- Innovation is a non-linear, interactive process involving the individual firm with a wide variety of outside organisations & influences;
- An innovation system is a network of tangible & intangible institutions & the interactions between them.

Main Elements of National Innovation Policy
- Support for collaborative research into advanced technologies;
- Support for mission-oriented research, development & systems;
- Support for the creation & development of new technology-based small companies;
- Support for single-company R&D & innovation, usually via grants, loans or tax breaks;
- Support for the adaptation & transfer of technology, knowledge & business best practice;
- Creation of centres of excellence, networks & other sources of technological & business advice;
- Most can be used or delivered at the regional level.

Evaluation
Consists of ex-ante appraisal, in-flight monitoring & ex-post evaluation, which should be undertaken as elements of a holistic process such as the ROAMEF cycle:
- Rationale: why a programme is necessary & should offer value for money;
- Objectives: operational, near-term (results), impact (ultimate or final);
- Appraisal: the process which determines what is to be funded;
- Monitoring: collection of data on achievements against objectives;
- Ex-post evaluation: analysis of the outcomes of the programme;
- Feedback: use of the results in future policy-making.
Ex-post evaluation should serve a number of purposes:
- Improving programme management;
- Informing the design of new programmes & policies;
- Demonstrating that the programme has yielded value for money;
- Informing budget allocation;
- Increasing knowledge about how the economy & society work.

How should we undertake Ex-post Evaluation 1?
- Both the design & ex-post evaluation of a programme (or policy) should be based on an analysis of the innovation processes being supported;
- A well-designed programme with clear objectives & a sound mode of operation will suggest what the parameters of an evaluation should be;
- Four main questions should be addressed in the ex-post evaluation of a programme:
  - Was the programme appropriate? Did it address a significant weakness in the innovation performance of the region or country concerned? Was it based on a well-founded analysis of the innovation behaviour it was aiming to influence?
  - Was the programme effective in achieving its objectives?
  - Did it yield value for money? Do the estimated additional benefits of the programme exceed the identifiable costs, or are they expected to do so in the foreseeable future? Was the programme rationale sound?
  - Was the programme efficient? Did it achieve the estimated benefits at the lowest possible cost?

How should we undertake Ex-post Evaluation 2?
- Experience of evaluation of national innovation support programmes shows that their success depends on their being appropriate to the needs & characteristics of the firms or sectors they are intended to help;
- Programmes also need to address issues which are important to the innovation & competitive performance of the nation, region or sector concerned;
- If a programme meets these two conditions & is clear about what it is trying to achieve, then it has a good chance of success;
- A key part of any evaluation is to collect evidence about whether its objectives have been achieved;
- Evidence on operational objectives and (near-term) results is usually obtained fairly quickly; evidence on (long-term) impacts may only be available some years after the programme has finished;
- Many (long-term) impacts may be indirect;
- In some cases an early interim evaluation may need to be followed by a full evaluation some years later.

How should we undertake Ex-post Evaluation 3?
- Value for money depends on whether a programme has a valid rationale, yields additional benefits which would not occur in its absence, & whether these additional benefits exceed any additional costs;
- Identification of additional benefits depends on whether the correct framework of analysis is applied;
- The more selective a programme is, the higher the proportion of supported activity that is additional, but the higher the costs of programme administration;
- Both additional benefits & additional costs depend on assumptions about what might have happened in the absence of the programme; specification of this 'counterfactual' is often very difficult in the case of innovation;
- One element in the indirect costs of innovation support is the displacement of similar activities by non-supported firms; this is also very difficult to identify and assess;
- Programme efficiency depends on minimising the sum of the cost of support provided under the programme plus the operational costs of the programme for a given level of benefits.
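The additionality arithmetic described above (gross benefits less deadweight and displacement, compared against support plus administration costs) can be sketched as a toy calculation. All figures and the function name are hypothetical; a real evaluation would estimate each term from survey evidence and a specified counterfactual.

```python
# Toy sketch of the value-for-money test described on this slide.
# Every number below is illustrative, not drawn from any real programme.

def net_additional_benefit(gross_benefit, deadweight_share,
                           displacement, support_cost, admin_cost):
    """Net benefit after removing deadweight (activity that would have
    happened anyway under the counterfactual) and displacement of
    similar activity by non-supported firms."""
    additional_benefit = gross_benefit * (1 - deadweight_share) - displacement
    total_cost = support_cost + admin_cost
    return additional_benefit - total_cost

# Example: 10m gross benefit, 40% would have occurred anyway,
# 1m displaced from non-supported firms, 3m support + 0.5m administration.
net = net_additional_benefit(10.0, 0.4, 1.0, 3.0, 0.5)
print(round(net, 2))  # 1.5 -> positive, so value for money on these figures
```

The sketch also shows the selectivity trade-off from the slide: tighter targeting lowers `deadweight_share` but raises `admin_cost`, so neither extreme is automatically best.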

Technology Transfer
- In an evaluation of technology transfer (TT) the UK DTI devised a simple framework which set out what is involved in a successful transfer of technology;
- This has five phases in the TT process:
  - Awareness 1 – the firm becomes aware of a new technology;
  - Awareness 2 – the firm considers the potential benefits in detail;
  - Transfer – the technology is acquired from an identified source;
  - Absorption – the technology is incorporated in designs of products & processes;
  - Exploitation – the firm learns much more about the technology through use;
- Successful public support for one or more phases needs the other phases to be addressed by other means; ex-post evaluation needs to examine whether this has happened.
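The coverage check implied by the last point can be sketched as a small helper: given the phases a programme supports and the phases addressed by other means, list any gaps an evaluator should flag. The phase identifiers and function name are illustrative, not part of the DTI framework itself.

```python
# Minimal sketch: checking that a support programme, together with other
# measures, covers all five phases of the DTI technology-transfer
# framework described on this slide. Names are illustrative.

TT_PHASES = ["awareness_1", "awareness_2", "transfer",
             "absorption", "exploitation"]

def uncovered_phases(programme_phases, other_measures_phases):
    """Phases addressed neither by the programme nor by other means --
    the gaps an ex-post evaluation should flag."""
    covered = set(programme_phases) | set(other_measures_phases)
    return [p for p in TT_PHASES if p not in covered]

# A grant scheme funding only acquisition and absorption, alongside a
# separate advisory service that raises initial awareness:
gaps = uncovered_phases(["transfer", "absorption"], ["awareness_1"])
print(gaps)  # ['awareness_2', 'exploitation']
```

On these assumed inputs the evaluator would ask how firms move from initial awareness to a detailed assessment of benefits, and how post-acquisition learning is supported.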

Delivery of Innovation Support
- The success of public support for innovation depends on the effectiveness of the means of delivery; ex-post evaluation should consider this;
- Programmes may be delivered by public departments, other public agencies, research institutes, universities, consultants etc., but in each case the ability to manage & deliver programmes can only be built up over time;
- Evaluations should examine not only the effectiveness of delivery but also the impact on the capabilities & knowledge of the delivery organisation, including whether it has learnt from any mistakes & what suggestions it has for improved delivery in the future;
- Frequent changes in the mix of public support or in the means of delivery prevent this necessary expertise from being built up and may also result in a fragmented delivery infrastructure consisting of a large number of small R&D institutes/technology transfer organisations;
- These will seek funding from wherever they can & will lack the clear mission statements & objectives essential to effective ex-post evaluation.

Conclusions
- Innovation is a complex interactive process which differs considerably across firms & sectors;
- Evaluation consists of ex-ante appraisal, in-flight monitoring & ex-post evaluation; these should be undertaken as part of an integrated process running from the initial conception of the programme through to the final verdict on its performance, and should inform future policy-making;
- Both the design & ex-post evaluation of a programme (or policy) should be based on a thorough analysis of the innovation processes it is trying to influence & of the relevant aspects of the innovation system;
- Ex-post evaluation should consider whether a programme was appropriate (addressed a significant feature of innovation performance & fitted the needs of those it was trying to help), whether it met its objectives, whether it offered value for money & whether it was operated efficiently;
- Ex-post evaluation should consider how a programme was delivered & the impact on the capabilities & knowledge of the delivery organisation(s);
- Evaluation of technology & knowledge transfer programmes should analyse their operation in the context of the overall transfer process.