
Beyond surveys: the research frontier moves to the use of administrative data to evaluate R&D grants Oliver Herrmann Ministry of Business, Innovation & Employment of New Zealand oliver.herrmann@med.govt.nz 

Overview
- Survey data vs administrative data
- Microdata research in New Zealand
- Using administrative data for evaluation: an R&D example
- Impact evaluation, causality and methods
- Evaluation findings
- Limitations

Survey data (Last Millennium)
- Empirical research and evaluation have relied on survey data: a sample of individuals drawn from a population, with a view to making statistical inferences about that population.
- Examples: Current Population Survey (US), Annual Survey of Hours and Earnings (UK), Business Operations Survey (NZ).
- During the second half of the 20th century, modern survey methods and statistical techniques for analysing survey data were developed; political science, sociology, and economics were all "revolutionized" by survey data sources.
- Lessons: complement survey-based knowledge with a qualitative perspective.

Administrative data (New Millennium)
Governments create comprehensive administrative data that cover socio-economic behaviour from education, earnings and the workplace to family composition, health and retirement.
Administrative data are highly preferable to survey data because:
- full population files are generally available and offer much larger sample sizes;
- administrative data have a longitudinal structure that enables evaluators to follow individuals over time and address many critical policy questions;
- administrative data do not suffer from high rates of non-response, attrition, and under-reporting.
Overcome the linear problem

New Zealand’s Hotbed of Microdata Research
- Statistics NZ has undertaken a number of projects that integrate data supplied by different government agencies.
- The Integrated Data Infrastructure (IDI) is a comprehensive database with longitudinal microdata about individuals, households, and firms, available for government researchers to utilise.
- The IDI pulls together a range of administrative and survey-based data sources (financial performance, employment, merchandise trade, business practices, government assistance), resulting in a rich and multi-dimensional dataset, in addition to contextual and other information.
- The IDI allows for the investigation of previously unanswerable questions: researchers and evaluators can answer research, policy, and evaluation questions to support informed decision making.

Using administrative data for evaluation: an example, publicly funded R&D
Innovation is well known to be an important driver of economic growth, and investment in R&D is among the factors that drive innovation.
Governments encourage business investment in R&D with the aim of correcting or alleviating two main market failures:
- firms' difficulty in fully appropriating the returns to their investment;
- difficulty in finding external finance, in particular for small start-up firms.

Policy objectives of R&D grants
- Increase and enhance business R&D.
- Improve the economic performance of assisted firms.
In New Zealand there are different types of government R&D subsidy:
- assistance to build R&D capability
- assistance for R&D projects

Impact evaluation = causal inference
Identify the causal effect of a policy or intervention: we did programme X, and because of it, Y happened.
Use the results to decide whether the programme should be continued, terminated, expanded, or modified.
Y happened because of X, not for some other reason; thus it makes sense to think that if we did X again in a similar setting, Y would happen again.
In a more research-friendly universe, we would be able to observe a single firm (call it company ICT) both after government gave R&D assistance and after it did not:
impact = Y(ICT, treated) − Y(ICT, untreated)

Evaluation problem (1)
In reality, we observe only one outcome; the outcome we do not observe is called the counterfactual.
In order to estimate the impact of treatment, we need to estimate the counterfactual outcome.
Is the solution to get more observations? Then we can calculate:
average(treated) − average(untreated)
But what if there is an underlying difference between the treated and the untreated (selection bias)?
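The danger of the naive average comparison can be shown with a small simulation. This sketch uses hypothetical numbers: treatment is assigned to high-productivity firms, so the raw difference in means vastly overstates the true effect.

```python
# Illustration of selection bias with synthetic, hypothetical data.
# Outcome = productivity + TRUE_EFFECT if treated; treatment goes to
# high-productivity firms, so the naive comparison is badly biased.

TRUE_EFFECT = 5.0

firms = [{"productivity": float(p), "treated": p >= 60} for p in range(100)]
for f in firms:
    f["outcome"] = f["productivity"] + (TRUE_EFFECT if f["treated"] else 0.0)

def mean(xs):
    return sum(xs) / len(xs)

avg_treated = mean([f["outcome"] for f in firms if f["treated"]])
avg_untreated = mean([f["outcome"] for f in firms if not f["treated"]])
naive_estimate = avg_treated - avg_untreated

print(naive_estimate)  # 55.0: more than ten times the true effect of 5.0
```

Here the "effect" picked up by the naive comparison is mostly the pre-existing productivity gap between the two groups, which is exactly what matching and difference-in-differences are designed to remove.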

Evaluation problem (2)
- Simplest world: impact is common across all individuals/firms.
- Impact is different across individuals/firms but doesn't affect participation/selection => variation in impact has few policy implications.
- Impact is different across individuals/firms and this affects participation/selection => implications for policy and for unbiased estimation.

Non-experimental or quasi-experimental approaches
We use non-experimental approaches to build a counterfactual:
- selection on observables (matching)
- selection on unobservables (difference in differences)

Control group and eight matching variables
- Employment and change in employment
- Total productivity and change in total productivity
- Capital intensity and change in capital intensity
- Exporting industry
- Firms in group
- Firm age
- R&D activity
- Ownership status

Actual and counterfactual outcomes
We compare changes in the performance of assisted firms with those of matched, similar New Zealand firms.
Comparing actual and counterfactual outcomes in this way isolates the additional impact of the public subsidy alone.

Evaluation findings
- Even before receiving R&D grants, assisted firms have higher sales and are larger, more capital intensive, more likely to export, and more likely to undertake R&D.
- The R&D subsidies have a significant impact on the economic performance of the firms.
- "Capability Building" has a positive impact on employment growth, sales, value added, and productivity.
- Counterintuitive finding: no impacts for "Project Funding"!
- Positive impacts for small firms, and for firms that had not undertaken R&D in the two years prior to receiving their assistance.
- No impacts for large firms, and no impacts for prior R&D performers.

Limitations and related evaluations
- Data limitations: outcomes are observed for only 4 years after firms receive assistance.
- Previous evaluations (case studies and surveys) showed very positive outcomes, but a before/after comparison (at completion of the grant) does not inform about any additional impact; most previous evaluations have therefore overestimated the economic impact.
- Our methods address this by comparing the performance of each assisted firm to a matched similar firm.

Additional slides

Potential outcomes framework (Rubin, 1974)

The main identification problem is the lack of a counterfactual response.
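In the standard notation of this framework, the identification problem can be written out explicitly; the last conditional expectation is the counterfactual that is never observed in the data:

```latex
% Potential outcomes: each firm i has two potential outcomes,
% Y_i(1) with treatment and Y_i(0) without; only one is ever observed.
\begin{align*}
Y_i &= D_i\, Y_i(1) + (1 - D_i)\, Y_i(0)
  && \text{observed outcome, treatment indicator } D_i \in \{0,1\} \\
\text{ATT} &= \mathbb{E}\bigl[\,Y_i(1) - Y_i(0) \mid D_i = 1\,\bigr]
  && \text{average treatment effect on the treated} \\
&= \mathbb{E}\bigl[\,Y_i(1) \mid D_i = 1\,\bigr]
   - \underbrace{\mathbb{E}\bigl[\,Y_i(0) \mid D_i = 1\,\bigr]}_{\text{never observed}}
\end{align*}
```

Matching and difference-in-differences are two different strategies for estimating that unobserved second term.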

Matching
Match each treated participant to one or more untreated participants based on observable characteristics.
After matching each treated unit to the most similar untreated unit, subtract the means and calculate the average difference.

Issues
- Matching assumes that selection/participation depends only on observable characteristics; it does not depend on unobservables.
- It is difficult to match on many observables X => condense all observables into one "propensity score" and match on that score.
- Good data are needed.
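The match-then-difference step can be sketched in a few lines. This is a minimal nearest-neighbour example with hypothetical numbers, matching on a single score column that stands in for the estimated propensity score:

```python
# Nearest-neighbour matching on a single score (a stand-in for an
# estimated propensity score). All numbers are hypothetical.

treated = [(0.8, 120.0), (0.6, 100.0), (0.4, 90.0)]    # (score, outcome)
controls = [(0.75, 110.0), (0.55, 95.0), (0.45, 88.0), (0.2, 70.0)]

def att(treated, controls):
    """Average treatment effect on the treated: for each treated unit,
    subtract the outcome of the control with the closest score."""
    diffs = []
    for score, outcome in treated:
        _, matched_outcome = min(controls, key=lambda c: abs(c[0] - score))
        diffs.append(outcome - matched_outcome)
    return sum(diffs) / len(diffs)

print(round(att(treated, controls), 3))  # 5.667
```

Real applications (including the evaluation described here) match on several variables at once and check balance between the matched groups, but the estimator has this same shape.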

Difference in differences
Before – after: growth in outcomes for assisted firms.
[Figure: outcome Y (e.g. sales) over time, from t1 to t2, for the firm with assistance; its outcome rises from Y0 to Y1, against a zero-growth baseline Y′.]

Difference in differences
Comparison of outcomes post-treatment: Y(b) − Y(d).
[Figure: outcome Y (e.g. sales) over time, t1 to t2; the firm with assistance reaches point b, while control firms without assistance reach point d; points a, c, Y0, Y1 and Y′ mark the remaining outcome levels.]

Difference in differences
= growth in sales for assisted firms − growth in sales for control firms.
[Figure: outcome Y (e.g. sales) over time, t1 to t2, for the firm with assistance and for control firms without assistance; the gap e between the assisted firm's growth and the control firms' growth is the difference-in-differences estimate.]
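The estimator pictured above reduces to one subtraction of growth rates. A minimal sketch with hypothetical mean sales figures:

```python
# Difference-in-differences on hypothetical mean sales.
# Assumes parallel trends: absent assistance, assisted firms would have
# grown at the same rate as the control firms.

def diff_in_diff(treated_before, treated_after, control_before, control_after):
    """(growth of assisted firms) - (growth of control firms)."""
    return (treated_after - treated_before) - (control_after - control_before)

# Assisted firms' mean sales grow 100 -> 130; controls grow 90 -> 105.
effect = diff_in_diff(100.0, 130.0, 90.0, 105.0)
print(effect)  # 15.0: assisted-firm growth net of control-group growth
```

Because the control firms' growth is netted out, any time-invariant unobserved difference between assisted and control firms (e.g. persistently higher baseline sales) drops out of the estimate.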