Expert mission on the Evaluation of SMEs and Entrepreneurship Policies and Programmes – Resources pack. Jerusalem and Neve Ilan, Israel, 18-20th March 2014.


1 Expert mission on the Evaluation of SMEs and Entrepreneurship Policies and Programmes – Resources pack. Jerusalem and Neve Ilan, Israel, 18-20th March 2014. Stephen Roper – stephen.roper@wbs.ac.uk; Christian Rammer – Rammer@zew.de

2 Overview of evaluation of SME policy – Why and How 1/6

3 Why evaluate? Evaluation provides:
– Valuable evidence for developing schemes to make the most of any investment
– Validation of the benefits of a scheme and its value – good news
– Hard evidence which can be used to make a case for continuing or developing a scheme in budget discussions
– A basis for inter-scheme, inter-regional and international comparisons

4 Steps in SME policy monitoring and evaluation
Step 1 – Logic model: key elements of the project monitoring and evaluation framework are defined when the SME support measures are approved. What will success look like?
Step 2 – Quantitative monitoring: statistics on the number of applications received, processed and approved, assistance delivery and funds allocation. What was the level of take-up?
Step 3 – Qualitative monitoring: data and information on application processing and approvals. Did support go to the target firms/individuals? Were applications dealt with fairly and quickly?

5 Steps in SME policy evaluation
Step 4 – Review and revise: monitoring data and stakeholder consultation at the end of the first phase. Is the scheme going to plan? Are adjustments needed?
Step 5 – Impact evaluation: the responsible agency/ministry commissions an independent impact study. How effective has the scheme been? What has been the level of additionality?
Step 6 – Final review: review of the lessons from the impact evaluation with stakeholders. What lessons have been learnt? Should the scheme be dropped? Continued? Developed?

6 Step 6 – Final review
Review of the lessons from the impact evaluation (Step 5) with stakeholders. What lessons have been learnt? Should the scheme be dropped? Continued? Developed?
Lead responsibility here lies with the sponsoring department. As at Step 4, a steering group or stakeholder discussion of the impact evaluation report precedes decisions. The scheme needs to be looked at as part of the overall profile of support, including synergies and overlaps with other measures.
Key operational points:
– Best if timing is planned at the outset, together with objectives etc.
– The final review process needs to be managed by the sponsoring department/agency
– Stakeholder engagement can be very helpful in informing discussions

7 Logic models and evaluation design 2/6

8 Step 1: Logic model – policy design
A logic model describes what the scheme aims to do, the resources available, and its anticipated outputs and outcomes. What will success look like? It is a kind of causal theory about how the scheme or programme is meant to work. It should specify:
– Objective(s) of the measure – target outputs and outcomes
– Targeted enterprises and eligibility criteria
– Budget and timeline
– Implementation agency
– Delivery and monitoring system
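The checklist above can be made concrete as a small data structure; a minimal Python sketch, where the class shape and all scheme details are invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """Elements a logic model should specify (hypothetical field names)."""
    objectives: list       # target outputs and outcomes
    eligibility: str       # targeted enterprises and eligibility criteria
    budget: float          # total budget for the measure
    timeline: tuple        # (start, end) of the scheme
    agency: str            # implementation agency
    monitoring: list = field(default_factory=list)  # delivery/monitoring indicators

# Illustrative B2B voucher scheme (all values invented)
voucher_scheme = LogicModel(
    objectives=["increase B2B collaboration", "raise SME innovation spending"],
    eligibility="SMEs with fewer than 250 employees",
    budget=2_000_000.0,
    timeline=("2014-04", "2016-03"),
    agency="regional development agency",
    monitoring=["applications received", "vouchers redeemed"],
)
```

Writing the model down in this explicit form forces the "what will success look like?" question to be answered before launch.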

9 Logic Model Illustrative logic model for B2B voucher scheme

10 Logic Model Illustrative logic model for an open innovation centre

11 Types of evaluation of SME policy 3/6

12 Types of SME policy evaluation
Type 1 – Measuring take-up: provides an indication of popularity but no real idea of outcomes or impact. Data needed: scheme management data.
Type 2 – Recipient evaluation: popularity and recipients' idea of the usefulness or value of the scheme; very often informal or unstructured. Data needed: recipient data.
Type 3 – Subjective assessment: subjective assessments of the scheme by recipients; often categorical, less frequently numeric. Data needed: recipient data.
Type 4 – Control group: impacts measured relative to a 'typical' control group of potential recipients of the scheme. Data needed: recipient and control group data.
Type 5 – Matched control group: impacts measured relative to a 'matched' control group similar to recipients in terms of some characteristics (e.g. size, sector). Data needed: recipient and control group data.
Type 6 – Econometric studies: impacts estimated using multivariate econometric or statistical approaches, allowing for sample selection. Data needed: survey data on recipients and non-recipients.
Type 7 – Experimental approaches: impacts estimated using random allocation to treatment and control groups, or to alternative treatment groups. Data needed: control and treatment group data.

13 Type 2 – Quantitative monitoring
Aim: to profile operational aspects of the scheme and provide a firm list for later impact analysis.
Key questions:
– Who applied for the scheme?
– How many were funded? How many went ahead?
– How long did approval take?
– Were all the funds allocated?
Data collection is the responsibility of the delivery agent, and data management should be part of the service contract. A key evaluation failure is a lack of good administrative data on the scheme.

14 Type 3 – Qualitative monitoring
Aim: to interpret and analyse admin data, perhaps supplemented by some 'key informant' interviews.
Possible questions:
– Did projects finish as expected?
– Did support go to the expected types of firms?
– Where are these firms located? In which industries?
– Were applications processed fast enough?
Might be seen as an 'interim evaluation' and is best done independently.

15 Step 4 – Review and revise
After the 'interim evaluation' (Step 3): Is the scheme going to plan? Are adjustments needed? A steering group or stakeholder discussion of the interim evaluation report leads to changes if necessary.
Key operational points:
– A phased project helps to clarify the points at which review and revision take place – plan at the outset
– The review-and-revise process needs to be managed by the sponsoring department/agency
– Timing and potential changes to the scheme need to be part of the contract with the delivery partner
Typical changes are to eligibility conditions, application processes or administration (e.g. voucher schemes).

16 Types 4 and 5 – Control group comparisons
Compare the performance of recipients with a group of similar firms/individuals matched on some quantitative dimension, e.g. growth, exporting etc. The difference in performance is attributed to the effect of the scheme. Data can come from surveys or business register data.
Advantages:
– Relatively simple methodology to apply
– Provides 'hard' data on impacts rather than self-assessment
Disadvantages:
– Often difficult to construct a relevant control group
– Requires information on both participants and controls – sometimes costly
– Does not control for self-selection into the recipient group (see example below)
Operational issues:
– Analysis should be undertaken independently, commissioned by the sponsoring department
– The key issue is the design of the control group, which needs careful thought
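The matched-control logic of Type 5 can be sketched in a few lines; a toy Python example, assuming matching on sector and employment size only, with all firms and growth figures invented:

```python
# Toy matched-control comparison: match each assisted firm to the nearest
# non-assisted firm (same sector, closest employment), then compare growth.
recipients = [
    {"sector": "mfg", "employees": 20, "growth": 0.12},
    {"sector": "ict", "employees": 8,  "growth": 0.25},
]
pool = [  # potential controls (non-recipients)
    {"sector": "mfg", "employees": 22, "growth": 0.05},
    {"sector": "mfg", "employees": 60, "growth": 0.02},
    {"sector": "ict", "employees": 9,  "growth": 0.18},
]

def match(firm, pool):
    """Nearest neighbour within the same sector, by employment size."""
    same_sector = [c for c in pool if c["sector"] == firm["sector"]]
    return min(same_sector, key=lambda c: abs(c["employees"] - firm["employees"]))

# Average growth differential between recipients and their matched controls
diffs = [f["growth"] - match(f, pool)["growth"] for f in recipients]
avg_growth_differential = sum(diffs) / len(diffs)
```

Note the caveat from the slide: even a perfect match on observables leaves self-selection into the recipient group uncontrolled.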

17 Type 6 – Econometric approaches
Typically based on a survey of recipients and non-recipients. How did performance changes among recipients compare with those of similar firms, allowing for firm characteristics and selection bias? Uses regression models to identify the policy impact on performance while controlling for selection bias.
Advantages:
– Seen as 'best practice' methodology
– Can control for firm/individual characteristics
– Can control for selection bias
Disadvantages:
– Costly, complicated and difficult to understand
Operational aspects:
– Analysis should be undertaken independently, commissioned by the sponsoring department
– Requires a survey of recipients and non-recipients, so costs are at least double those of self-assessment
– Telephone interviews (CATI) can be very cost effective
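The regression idea behind Type 6 can be illustrated on toy data: with a single treatment dummy, OLS recovers the raw recipient/non-recipient mean gap. A full Type 6 evaluation would add firm-level controls and a selection correction (e.g. a probit first stage with an inverse Mills ratio term); that machinery is omitted here and all numbers are invented.

```python
# Outcome (growth) per firm and a dummy marking scheme recipients
growth  = [0.10, 0.14, 0.12, 0.03, 0.05, 0.04]
treated = [1, 1, 1, 0, 0, 0]

n = len(growth)
mean_y = sum(growth) / n
mean_t = sum(treated) / n

# OLS slope for a single regressor: cov(t, y) / var(t)
cov = sum((t - mean_t) * (y - mean_y) for t, y in zip(treated, growth)) / n
var = sum((t - mean_t) ** 2 for t in treated) / n
beta = cov / var  # equals mean(growth | treated) - mean(growth | untreated)
```

Without the selection correction, `beta` still conflates the policy effect with self-selection, which is exactly what the two-stage approaches described in the examples below are designed to separate out.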

18 Examples of SME policy evaluation 4/6

19 Type 5 evaluation example: university start-up/business centres
– Federal government initiative to facilitate entrepreneurship at universities and foster partnerships with industry (Germany: EXIST; Austria: AplusB)
– Funding of centres (= staff) plus events/promotion plus start-up support (financial and consulting)
– Evaluation: multi-method approach combining qualitative (interviews) and quantitative (control group) elements, considering the institutional setting
– Classifying universities by type of "scientific culture"
– Detailed analysis of the centres' characteristics and history
– Impact analysis of both organisational features (the path towards an "entrepreneurial university") and economic impacts (increase in the number of successful start-ups)

20 Level 5 evaluation: Danish Growth Houses – matched control groups
Focus – Danish Growth Houses provide advisory and brokering services for high-growth firms, regionally distributed across Denmark. A publicly funded initiative with some EU support in less developed areas. The key competency is business advice and support.
Aim – estimate the value for money of the Danish Growth House network, covering all companies using the network from September 2008 to March 2011; assessment in April 2013. To include displacement and multiplier effects and identify NPV.
Data – control group: 2,650 assisted firms (admin data) matched against firms from business registry data (done by Statistics Denmark) to get the growth differential.
Methodology – direct effect estimated using a matched control group approach. Matching on size, sector, area, ownership and growth track record, adjusted for selection bias (-50 per cent). Estimated displacement and multiplier effects.
Key results – the estimated effect was an NPV of around 2.6 over two years – so very effective – along with estimated job creation.
Source: Fact sheet from the Irish Group (2013) 'Calculation of the socio-economic yield of investments in Vaeksthuset', Copenhagen. Provided by the Danish Business Administration.

21 Level 6 evaluation: Business Links consultancy and advice – econometric evaluation
Focus – Business Links was a UK national government service providing either intensive consultancy/advice support or light-touch support. Support period 2003-04; impact period 2005-06; surveys in 2007.
Aim – assess the cost effectiveness of business support. Econometric estimates of impact; survey-based estimates of displacement, multipliers etc. to get NPV.
Data – the evaluation was survey-based: 1,000 firms with intensive support, 1,000 with light-touch support and 1,000 matched controls.
Methodology – two-stage Heckman selection models (using information sources as the identifying factor):
– Probit models for membership of the intensive versus control group
– Regression models for (log) growth of supported firms
Key results – intensive support raised the employment growth rate significantly; no sales growth impact.
Source: Mole, K.F., M. Hart, S. Roper and D. Saal (2008) 'Differential Gains from Business Link Support and Advice: A Treatment Effects Approach', Environment and Planning C 26: 316-334.

22 Business Links evaluation – long-term follow-up using administrative data
Not an evaluation as such, but a follow-up of the groups using business register data. Able to monitor growth over a longer period – firms assisted in 2003, studied in 2010.
Evidence: the 'intensive assist' group clearly grew more rapidly and in a more sustained way.
Conclusion – support had significant long-term effects (but, without proper controls, care is needed in interpretation!).

23 Level 6 evaluation: interim evaluation of Knowledge Connect – innovation voucher scheme (2010)
Focus – Impact Evaluation Framework (IEF) compliant evaluation of a local innovation voucher programme in London intended to increase collaboration between SMEs and higher education. Study period April 2008 to September 2009; survey in the first quarter of 2010.
Aim – interim and formative impact and value-for-money evaluation, including assessment of displacement and multiplier effects, combined into a final NPV calculation. Key aims were to test the rationale for continuation and suggest developmental areas. Included an assessment of Strategic Added Value.
Data – telephone interviews with 175 recipients and 151 matched non-recipients, plus around 10 in-depth case studies. Mixed-methods approach.
Methodology – two-stage modelling of impact as well as subjective assessment of additionality and partial additionality. Survey-based estimates of displacement and multiplier effects to give NPV. Modelling of behavioural effects.
Key results – positive effects, but under-performance against target and a low NPV. Some behavioural effects. The short timeline is perhaps responsible and may lead to under-estimated benefits. Poor admin data made following up clients difficult.
Source: The Evaluation Partnership/Warwick Business School (2010) 'Interim Evaluation of Knowledge Connect – Final Report'.

24 Level 7 evaluation: Creative Credits – an experimental evaluation
Focus – an experiment to: (a) evaluate the effectiveness of a new voucher linking SMEs and creative enterprises – a novel instrument; and (b) test the value of an RCT+ approach (qualitative and quantitative elements).
Aim – impact and value-for-money evaluation (no assessment of displacement or multiplier effects).
Data – survey data for treatment and control groups: 4 surveys over 2 years; 150 treatment and 470 control group firms (applicants who did not get a voucher).
Methodology – RCT with simple randomisation: firms either receive a voucher or not. Also two-stage Heckman modelling to check results. Extensive chasing was needed to maintain participation in the study.
Key results – positive additionality (innovation and growth) after 6 months but no discernible effect after 12 months.
Source: Bakhshi, H. et al. (2013) 'An Experimental Approach to Industrial Policy Evaluation: The Case of Creative Credits', ERC Research Paper 4, June 2013.
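Simple randomisation of the kind used in the Creative Credits design amounts to a coin flip per applicant; a minimal sketch, with firm names invented and a fixed seed so the allocation is reproducible:

```python
import random

# Invented applicant pool; in practice this would be the list of eligible firms
applicants = [f"firm_{i}" for i in range(10)]
rng = random.Random(42)  # fixed seed => reproducible allocation

# Each applicant is independently assigned to an arm with probability 0.5
allocation = {firm: ("treatment" if rng.random() < 0.5 else "control")
              for firm in applicants}
```

With simple randomisation the arm sizes can be unbalanced by chance; the stratified randomisation used in the Growth Accelerator pilot below balances allocation within sector, size and geography strata.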

25 Level 7 evaluation: Growth Accelerator pilot programme (ongoing)
Focus – evaluation of leadership and management (LM) support and coaching for high-growth small firms through the UK Growth Accelerator programme. Intervention in 2014; survey follow-ups in 2015 and 2016 (probably); impact periods of 12 and 24 months.
Aim – impact evaluation (no assessment of displacement or multiplier effects).
Data – administrative and baseline survey data, with benchmarking material from the coaching process. 300 firms in each arm of the trial initially.
Methodology – RCT with stratified randomisation (sector, size, geography). Firms either receive (incentivised) LM training and coaching at the second stage or nothing. (Initial LM training is provided free.)
Key results – n/a.
Source: programme documents; UK Growth Accelerator information at http://www.growthaccelerator.com/

26 Measuring additionality - survey and econometric approaches 5/6

27 Impact evaluation: key questions
The big question is whether the scheme worked in a cost-effective way. The assessment is typically quantitative but may also have qualitative elements. Need to consider:
– What was the degree of additionality and partial additionality?
– What was the value of the benefits? How long will they last?
– What was the extent of deadweight (i.e. zero additionality)?
– Was there any displacement or multiplier effect?
– Was the cost–benefit balance positive?
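The cost–benefit arithmetic behind these questions can be sketched as follows; all benefit figures, adjustment rates, the discount rate and the scheme cost are invented for illustration:

```python
# Gross yearly benefits reported by recipients (invented)
gross_benefit = [400_000, 420_000, 430_000]

deadweight    = 0.30   # share of benefit that would have occurred anyway
displacement  = 0.10   # share displaced from other local firms
multiplier    = 1.20   # supply-chain / income multiplier
discount_rate = 0.035  # annual discount rate
scheme_cost   = 600_000

# Net additional benefit per year after the three adjustments
net_yearly = [b * (1 - deadweight) * (1 - displacement) * multiplier
              for b in gross_benefit]

# Discount each year's net benefit back to the present, less the scheme cost
npv = sum(b / (1 + discount_rate) ** (t + 1)
          for t, b in enumerate(net_yearly)) - scheme_cost
benefit_cost_ratio = (npv + scheme_cost) / scheme_cost
```

A positive NPV (equivalently a benefit–cost ratio above 1) is what "the cost–benefit balance was positive" means in this framework.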

28 Step 5 – impact evaluation: self-assessment of impacts by recipients
Based on a survey of recipients only. Was the scheme seen as effective and useful? What difference did the scheme make? What was the degree of additionality?
Advantages:
– Relatively cost effective
– Requires only admin data to draw the sample
– High levels of compliance, as firms have benefitted from the scheme
Disadvantages:
– Subjective impact assessment only (although see below)
– No real counter-factual
Operational issues:
– The survey should be undertaken independently, commissioned by the sponsoring department
– Telephone interviews (CATI) can be very cost effective

29 Impact evaluation – illustrative questions
Additionality and partial additionality:
If you had not received support from scheme XXX would you have:
– Undertaken the project anyway in the same way
– Done the project more slowly
– Done the project on a smaller scale
– Not done the project at all
Scale of impact (ask turnover):
If you had not received support from scheme XXXX would your turnover have been:
– >40 per cent lower
– 20-40 per cent lower
– 1-20 per cent lower
– The same
– 1-20 per cent higher
– 20-40 per cent higher
– >40 per cent higher
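Responses to the additionality question above might be tabulated like this; the response labels and data are invented, and the share answering "anyway" gives the deadweight estimate:

```python
from collections import Counter

# Invented survey responses, one per recipient, using shorthand labels
# for the four answer options in the additionality question
responses = ["anyway", "smaller_scale", "not_at_all", "slower",
             "anyway", "not_at_all", "smaller_scale", "not_at_all"]

counts = Counter(responses)
# Deadweight = share of recipients who would have done the project
# anyway, in the same way (zero additionality)
deadweight_share = counts["anyway"] / len(responses)
```

Here 2 of 8 recipients report zero additionality, so the estimated deadweight share is 25 per cent; the "slower"/"smaller scale" answers capture partial additionality.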

30 Identifying additionality
(1) Simple difference (before/after comparison): what did a treated firm do after treatment, with respect to the programme's goal? (Potentially controlling for other factors influencing the goal.)
(2) Difference in difference: what did a treated firm do after treatment, compared to similar firms not receiving treatment?
(3) Difference in difference with common trend control: what did a treated firm do after treatment, compared to similar firms not receiving treatment, and considering pre-treatment trends with respect to the programme goal?
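On toy numbers (all invented), the three estimators can be computed side by side:

```python
# Mean of the goal variable for treated and control firms, before and after
treated_pre, treated_post = 10.0, 14.0
control_pre, control_post = 10.0, 11.5

# (1) Simple difference: before/after change for treated firms only
simple_difference = treated_post - treated_pre

# (2) Difference in difference: net out the change seen in similar,
# non-treated firms over the same period
diff_in_diff = (treated_post - treated_pre) - (control_post - control_pre)

# (3) Additionally net out each group's own pre-treatment trend
# (assumed growth per period before treatment; values invented)
treated_pre_trend, control_pre_trend = 1.0, 0.5
did_common_trend = ((treated_post - treated_pre - treated_pre_trend)
                    - (control_post - control_pre - control_pre_trend))
```

The three estimates shrink as more of the counterfactual is accounted for: the raw before/after gain of 4.0 falls to 2.5 once the control group's change is netted out, and to 2.0 once differing pre-treatment trends are considered.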

31 Simple Difference [chart: goal variable over time, pre- and post-treatment]

32 Difference in Difference [chart: goal variable over time for treated and control firms, pre- and post-treatment]

33 Difference in Difference with Common Time Trend [chart: goal variable over time with pre-treatment trends extrapolated]

34 Data requirements
– Panel data on treated and non-treated firms
– Information on the target variable in the pre-period, at the start of treatment and at the end of treatment
– A large enough sample of non-treated firms (ideally, the total population)
– Variables that are critical for being selected for treatment
– Factors that may affect the programme's target variable

35 Structural models
– Setting up an economic model (a profit function of the firm) that incorporates the effect of public support
– Estimating the model parameters (e.g. in a similar way to selection-correction models)
– Running model simulations of the impacts of public support on firm decisions
→ Demanding data requirements (panel data, key economic variables)
References:
Peters, B., Roberts, M., Vuong, V.A. and Fryges, H. (2013) 'Estimating Dynamic R&D Demand: An Analysis of Costs and Long-Run Benefits', NBER Working Paper 19374.
Arqué-Castells, P. and Mohnen, P. (2014) 'Sunk Costs, Extensive R&D Subsidies and Permanent Inducement Effects', mimeo.
Arqué-Castells, P. (2013) 'Persistence in R&D Performance and its Implications for the Granting of Subsidies', Review of Industrial Organization, DOI 10.1007/s11151-013-9381-0.
González, X., Jaumandreu, J. and Pazó, C. (2005) 'Barriers to Innovation and Subsidy Effectiveness', Rand Journal of Economics 36(4): 930-949.

