
Do European Social Fund labour market interventions work? Counterfactual evidence from the Czech Republic. Vladimir Kváča, Czech Ministry of Labour and Social Affairs Oto Potluka, University of Economics, Prague

Contents – up next:
 Introduction
 The Intervention
 Methods Used and a Few Results
 Conclusion
 Discussion

CIE HRE counterfactually evaluates EU support in the CZ
 CIE HRE – Counterfactual Impact Evaluation of the Operational Programme Human Resources and Employment.
 Examines the impact of employee training (funded by the European Social Fund) on companies’ employment.
 Research in progress ( , 2011).
 See impact-evaluation-of-the-op-hre/ for more info.
Introduction

We are looking for the effect of companies training their employees
 Grants for employers to train employees, with the aim of preventing unemployment by improving companies’ competitiveness and people’s skills.
 Grants between €40,000 and €600,000.
 Any company in the Czech Republic was eligible; employees in Prague were excluded.
 There are firms in the sample, consisting of non-applicants (n= ), successful applicants (n= 1447), and rejected applicants (n= 1183).
Intervention

The selection procedure rules out projects below 65 points
Step 1 – Formal check (signatures, compulsory annexes, eligibility, etc.). Done by: the ministry officer responsible for a particular call for proposals. Output: YES / NO.
Step 2 – Content appraisal (quality of the project proposal assessed according to several criteria). Done by: two randomly selected referees; if their opinions are too divergent, a third referee is added and the most extreme opinion is eliminated. Output: points; if the average is below 65 points the project is rejected (NO), if 65 or above it passes (YES).
Step 3 – Selection commission (checks the previous steps and the project ranking, can propose changes in the budget). Done by: a commission of external stakeholders and ministry officials. Output: checks the previous steps and can change a project’s status from YES to NO (never vice versa); ranks projects by number of awarded points and decides the cut-off point for supported projects, depending on the available money.
Step 4 – Grant contract signed. Done by: a ministry official and the project representative. Output: if the project author refuses, the project is not started (happens rarely).
Intervention
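The content-appraisal step can be sketched in a few lines. The slide does not state the divergence threshold, so the 20-point limit below is a hypothetical parameter, and "eliminating the most extreme opinion" is implemented here as keeping the two closest of the three scores:

```python
def content_appraisal(scores, divergence_limit=20.0, cutoff=65.0):
    """Sketch of step 2: average two referee scores; if they diverge by
    more than divergence_limit (hypothetical value, not given on the
    slide), a third score is required and the most extreme opinion is
    dropped by keeping the two closest scores. Returns (points, passed)."""
    a, b = scores[0], scores[1]
    if abs(a - b) > divergence_limit:
        if len(scores) < 3:
            raise ValueError("opinions too divergent: a third referee is needed")
        c = scores[2]
        # Keep the pair of scores that agree most closely; the remaining
        # (most extreme) opinion is thereby eliminated.
        pairs = [(abs(a - b), (a, b)), (abs(a - c), (a, c)), (abs(b - c), (b, c))]
        _, (x, y) = min(pairs)
        points = (x + y) / 2.0
    else:
        points = (a + b) / 2.0
    return points, points >= cutoff
```

For example, two close scores are simply averaged, while a divergent pair forces a third opinion whose outlier is discarded.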

Counterfactual methods create comparison groups, in many ways
 In essence, counterfactual methods create comparison groups to answer the question “what would have happened without the support?”
 We use a whole range of counterfactual methods: regression discontinuity; difference in differences with propensity score matching; instrumental variables.
 The expected outcome is the change in the number of employees.
Methods

Regression discontinuity examines projects around the cut-off point
 Regression discontinuity exploits the selection procedure’s cut-off at 65 points.
 The assumption is that projects barely above 65 points are as good as those barely below.
[Figure: probability that a project is supported (jumping from 0 to 1) plotted against points from assessment.]
Methods – RDD
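As a minimal sketch of the idea (not the study’s actual estimator), a sharp RDD can be approximated by comparing mean outcomes for projects scoring just above versus just below the 65-point cutoff within a small bandwidth; the bandwidth and the data in the example are hypothetical:

```python
def rdd_estimate(points, outcomes, cutoff=65.0, bandwidth=5.0):
    """Naive sharp-RDD sketch: difference in mean outcomes between
    projects just above and just below the cutoff (hypothetical
    bandwidth; a real analysis would use local regression)."""
    above = [y for p, y in zip(points, outcomes) if cutoff <= p <= cutoff + bandwidth]
    below = [y for p, y in zip(points, outcomes) if cutoff - bandwidth <= p < cutoff]
    return sum(above) / len(above) - sum(below) / len(below)
```

With made-up data such as `rdd_estimate([60, 62, 64, 66, 68, 70], [0, 1, 2, 9, 10, 11])`, the jump at the cutoff is read as the treatment effect.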

According to RDD, support for small and medium companies works better
Employment ( ). Significance: * denotes 10%, ** 5%, *** 1%. Columns: total difference in number of employees caused by support; difference per company (mean); OP HRE cost for one additional employee (EUR).
 Small firms: total 5 468**; per company 7.33** (1.3, 13.3).
 Medium firms: insignificant (-6.4, 28.4); cost n/a.
 Large firms: insignificant (-12.2, 6.2); cost n/a.
Results – RDD

PSM matches comparable companies from another sample
 Propensity score matching creates a comparison group according to various observable characteristics, e.g. companies similar in size, industry, region, etc.
 We draw the comparison-group companies from (i) all other eligible companies, and (ii) eligible and interested, but rejected, companies.
Methods – PSM and DiD
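The matching step can be sketched as one-to-one nearest-neighbour matching on the propensity score. The sketch assumes the scores have already been estimated (e.g. by a logistic regression of support status on size, industry and region, which is an assumption here, not a detail from the slides):

```python
def nearest_neighbour_match(treated_scores, control_scores):
    """For each treated unit, find the control unit with the closest
    propensity score (matching with replacement). Returns the index of
    the matched control for each treated unit."""
    matches = []
    for ts in treated_scores:
        best = min(range(len(control_scores)),
                   key=lambda j: abs(control_scores[j] - ts))
        matches.append(best)
    return matches
```

For instance, treated units with scores `[0.8, 0.3]` drawn against controls `[0.1, 0.35, 0.75]` are matched to the third and second controls respectively; the matched controls then serve as the comparison group for the DiD step.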

DiD compares the supported and comparison groups over time
 Difference in differences assumes a similar trend in the treatment and comparison groups (see next slide).
 In our analysis, we created the comparison group using PSM and then applied DiD.
 Results: in small and medium companies there is a larger effect on employment.
Methods – PSM and DiD

Results for companies of all sizes pooled: average change in the number of employees per company, 2008– (counterfactual comparison).
Treatment group difference (before vs after): 14. Control group difference (before vs after): 28. Impact = 14 jobs per company saved on average.
[Figure: difference before, difference after, and impact for the treatment and control groups; some figures stripped in the transcript.]
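The arithmetic behind the diagram is two subtractions. The before/after employment levels below are hypothetical, chosen under one reading of the stripped figures (the treated group shrinks by 14 employees, the comparison group by 28, so support saved 14 jobs per company):

```python
def diff_in_diff(treat_before, treat_after, control_before, control_after):
    """Difference-in-differences impact: the change in the treated group
    minus the change in the comparison group."""
    return (treat_after - treat_before) - (control_after - control_before)

# Hypothetical levels consistent with changes of -14 (treated) and -28 (control):
impact = diff_in_diff(100, 86, 100, 72)  # = 14 jobs per company saved
```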

Employment ( ), by firm size. Columns: total difference in number of employees caused by support; mean difference per company; OP HRE cost for one additional employee (EUR).
PSM, supported vs rejected (1447 : 1183 companies):
 Small: ** ** (0.5, 8.5)
 Medium: ** ** (1.1, 12.7)
 Large: insignificant (-46.1, 6.1); insignificant.
PSM, supported vs uninterested:
 Small: ** ** (0.6, 10.6)
 Medium: ** ** (4.5, 17.9)
 Large: insignificant (-11.1, 5.1); insignificant.

IV uses the personal biases of project referees
 Instrumental variables exploit differences in the average points each referee awards across all their projects (their differing “strictness” or “generosity”).
 We look at similar projects which were (randomly) allocated different referees. The identity of the referee influences the chance of support, and thus also the chance of the final impact.
Methods – IV
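A minimal version of this idea is the Wald estimator with a binary instrument, e.g. whether a project happened to draw a "generous" or a "strict" referee (the binary coding and the data below are hypothetical simplifications of the referee-leniency design):

```python
def wald_iv(z, d, y):
    """Wald IV estimator for a binary instrument z (1 = generous referee,
    0 = strict referee), treatment d (1 = supported) and outcome y:
    the reduced-form difference in mean outcomes divided by the
    first-stage difference in treatment probabilities."""
    def mean(values):
        return sum(values) / len(values)
    y1 = mean([yi for zi, yi in zip(z, y) if zi == 1])
    y0 = mean([yi for zi, yi in zip(z, y) if zi == 0])
    d1 = mean([di for zi, di in zip(z, d) if zi == 1])
    d0 = mean([di for zi, di in zip(z, d) if zi == 0])
    return (y1 - y0) / (d1 - d0)
```

Because referee assignment is random, the instrument shifts the chance of support without directly affecting employment, which is what licenses the division.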

Even a good method can fail to provide numerical results, but those aren’t everything
 No significant numerical results.
 Real lessons learned, however: we are trying to minimize the role of referees for the next period, and we are aware of the need to select and train referees better, so as to standardize their approach to applications.
 We will test whether the weather influences referees’ decisions.
Results – IV

Small and medium companies seem to be a better investment
 Support of small and medium companies yields significantly larger returns in the number of employees (similarly, see Mouqué, 2012).
 On average, 1.2 persons per project are employed in its implementation.
 We are using the results to focus future calls for proposals (2014+).
 We expect relevant results for profits and sales; the causal mechanism within our time frame is questionable.
Conclusions

Mouqué, D. (2012), “What are counterfactual impact evaluations teaching us about enterprise and innovation support?”

Conducting the evaluation is just the beginning; we must also use it
 Further discussion and presentation to the evaluation communities in both the Czech Republic and the EU.
 Looking for other programmes to evaluate.
 Evaluation timing is important.
Conclusions