Catherine Searle Renault, Ph.D., RTI International


Defining and Measuring Success in Technology-based Economic Development
Catherine Searle Renault, Ph.D., RTI International Center for Technology Applications
Science, Technology and Economic Growth: A Practicum for States
March 23, 2004

Overview
Principles of Evaluation and Measurement
Theory of Technology-based Economic Development
Indicators
Ways to Collect Data
Example: Evaluation of Maine's Public Investment in R&D
Challenges and Opportunities
Lessons Learned

Why Measure and Evaluate?
Evaluation is the collection, analysis, interpretation, and communication of information about the effectiveness of programs undertaken for the public good. It serves:
To aid decisions about whether a program should be expanded, continued, improved, or curtailed
To increase the effectiveness of program management
To satisfy calls for accountability
To measure the impact on the core problem

Key Concepts
Evaluation is a process, not an event.
Evaluation is for practical use, not to sit on the shelf.
The questions to be answered are derived from the program itself.
Evaluation compares "what is" with "what would have been" and "what should be."
It takes place in a setting where work and programs are ongoing.

Definitions
Program: a complete initiative, e.g. the Advanced Technology Program.
Project: one interaction with a client, e.g. a single ATP award.
Inputs: resources used to produce outputs and outcomes.
Outputs: products and services delivered; the completed products of internal activities.
Outcomes: events, occurrences, or conditions outside the program itself that are important.
Intermediate outcomes: important outcomes, but not ends in themselves.
End outcomes: the sought-after results.

General Logic Model
INPUTS (Personnel, Facilities, Funding) → OUTPUTS (Activities, Clients Served, Awards Made) → INTERMEDIATE OUTCOMES (Mid-Point Events) → END OUTCOMES (Ending Events)
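The logic-model chain above can be represented as a small data structure for evaluation bookkeeping. This is an illustrative Python sketch; the class, field, and example names are assumptions for illustration, not from the deck:

```python
from dataclasses import dataclass

@dataclass
class LogicModel:
    """Chain a program's resources through to its sought-after results."""
    inputs: list          # e.g. personnel, facilities, funding
    outputs: list         # e.g. activities, clients served, awards made
    intermediate: list    # mid-point events outside the program itself
    end_outcomes: list    # ending events: the sought-after results

# Hypothetical program, loosely modeled on an ATP-style grant program
atp_like = LogicModel(
    inputs=["personnel", "facilities", "funding"],
    outputs=["clients served", "awards made"],
    intermediate=["new sponsored R&D with local companies"],
    end_outcomes=["growth in high-technology employment"],
)

def describe(model: LogicModel) -> str:
    """Render the model as a single left-to-right chain."""
    return " -> ".join(
        ", ".join(stage)
        for stage in (model.inputs, model.outputs,
                      model.intermediate, model.end_outcomes)
    )
```

Writing the model down this explicitly forces the evaluation questions into the open: each stage of the chain needs its own indicators.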

Definitions
Measurement/Monitoring: What are the outcomes of the program/project?
Impact Measurement: Calculate the economic impact of the outcomes.
Program/Project Evaluation: Involves causality.

Causality and Attribution
To prove causality, you need four conditions:
That the outcomes exist
That the inputs precede the outcomes
That the inputs, outputs, and outcomes are related
That all other explanations are accounted for
Attribution is weaker, but easier to prove: clients say (attribute) that their results are due to the program/project.
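The first three conditions can at least be screened mechanically from yearly program data; the fourth requires a research design, not arithmetic. A hedged Python sketch, with illustrative data and an arbitrary 0.5 correlation threshold chosen only for the sketch:

```python
def causality_conditions(inputs_by_year, outcomes_by_year):
    """Screen the first three causality conditions on yearly series:
    outcomes exist, inputs precede outcomes, and the two are related.
    The fourth condition (ruling out all other explanations) needs a
    research design such as a control group and is not checked here.
    Both arguments map year -> value."""
    outcome_years = [y for y, v in outcomes_by_year.items() if v > 0]
    input_years = [y for y, v in inputs_by_year.items() if v > 0]
    if not outcome_years or not input_years:
        return False  # condition 1 fails: no outcomes (or inputs) at all
    if min(input_years) >= min(outcome_years):
        return False  # condition 2 fails: inputs did not precede outcomes
    # Condition 3: a crude Pearson correlation over the common years
    common = sorted(set(inputs_by_year) & set(outcomes_by_year))
    if len(common) < 2:
        return False
    xs = [inputs_by_year[y] for y in common]
    ys = [outcomes_by_year[y] for y in common]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    if sx == 0 or sy == 0:
        return False
    return cov / (sx * sy) > 0.5  # arbitrary "related" threshold
```

Passing this screen only makes a causal claim plausible; it does not prove it.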

Principles for Evaluation
If there is more than one program, establish a consistent approach for all programs.
Ensure clear articulation of goals, in terms as concrete as possible.
Be as rigorous as possible in design and analysis to increase validity and credibility, but make tradeoffs that reflect operational issues.
Provide evaluation at the state level as well as data for individual program management.

Program Theory-based Evaluation
Use the theory behind the intervention to design appropriate indicators of intermediate and end outcomes:
Identify the goals and objectives of the program.
Construct a model of what the program is supposed to accomplish.
Collect data to compare the goals, the actual observed outcomes, and what would have happened otherwise, i.e. without the intervention.
Analyze and interpret the results.
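The comparison step, goals versus observed outcomes versus the no-intervention counterfactual, reduces to simple arithmetic once those three numbers are estimated. A minimal sketch with illustrative figures, not from the source:

```python
def net_impact(observed, counterfactual):
    """'What is' minus 'what would have been': the change attributable
    to the intervention rather than to background trends."""
    return observed - counterfactual

def goal_attainment(observed, counterfactual, goal):
    """Net impact as a share of the stated goal ('what should be')."""
    return net_impact(observed, counterfactual) / goal

# Illustrative: 120 new high-tech jobs observed, 80 expected without
# the program, against a goal of 50 net new jobs.
```

The hard evaluation work is estimating the counterfactual; the arithmetic is the easy part.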

Goals and Objectives of Technology-based Economic Development
Improve citizens' quality of life by:
Creating and retaining high-quality jobs (defined as higher-paying), generally in technology-based businesses
Creating and retaining (and in some cases, recruiting) high-quality companies (defined as high-growth, high-paying), generally in technology-based industries
Improving the stability and/or competitiveness of the local and regional economy through innovation

Logic Model for Technology-Based Economic Development
[Diagram: basic and applied research at research institutions, supported by government R&D grants, foundation funding, and research funding, flows through the technology transfer office into an R&D-driven industry and an innovation economy. Supporting factors include joint research and students, workforce, government funding competition, debt and equity funding, the cost of doing business, and developing marketing opportunities for the market.]

Product/Company Life Cycle Model
Basic Research → Applied Research → Product Launch → Enhance Product → Product Maturity

Interventions to Build an Innovation Economy
Stages: Build Research Capacity; Company: Basic/Applied Research → Design for Manufacturing → Product Launch → Enhance Product → Product Maturity
Technical Assistance: Centers of Excellence, Advanced Manufacturing Centers, MEP, Sea Grant, CSREES
Business Assistance: Incubators, Business Development, Science Parks
Funding: EPSCoR, Federal Funding, ATP, SBIR, STTR, State Research Grants, State-Sponsored Seed Funds, SBA Loans

Intermediate Indicators
Researchers: S&E graduate students; federal R&D grants; R&D expenditures; patents; publications; new sponsored R&D with local companies
Companies: patents; venture capital raised; SBIR and STTR awards won; other federal programs won (e.g. ATP); M&A activity; IPO activity

End Outcome Indicators
Average annual earnings of employees
Number of high-technology companies in the state/region
Number of scientists and engineers employed in the state/region
Number of company births, especially high-technology
Percent of revenue from outside the state
Revenue per employee (productivity)
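Two of these end-outcome indicators, revenue per employee and percent of revenue from outside the state, reduce to simple arithmetic over survey records. An illustrative Python sketch; the record field names and sample figures are assumptions, not from the source:

```python
def revenue_per_employee(companies):
    """Productivity proxy: total revenue divided by total employment."""
    total_revenue = sum(c["revenue"] for c in companies)
    total_employees = sum(c["employees"] for c in companies)
    return total_revenue / total_employees

def out_of_state_revenue_share(companies):
    """Fraction of total revenue earned outside the state (export base)."""
    total = sum(c["revenue"] for c in companies)
    outside = sum(c["revenue"] * c["out_of_state_share"] for c in companies)
    return outside / total

firms = [  # illustrative survey records
    {"revenue": 2_000_000, "employees": 10, "out_of_state_share": 0.6},
    {"revenue": 500_000, "employees": 5, "out_of_state_share": 0.2},
]
```

Note that both indicators are ratios over the whole portfolio, so large firms dominate them; reporting medians alongside can guard against that.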

Collecting Data
Three possible methods; use one or all:
Annual survey of all recipients of (all) programs: use with a control group to assess causality; potentially split companies and research institutions
Indicator data for the state and benchmark states, to assess changes in competitiveness
Case studies, to understand detailed trends

Key Decisions for Annual Survey
Who to survey: the universe of companies and researchers; develop a single list; sample or all?
What is the unit of analysis: the company or the project?
How frequently: annually? Keep respondents on the list for five years.
When to survey: July-August is a good match for government reporting, but poor for companies.
When to analyze data and report: driven by state budget cycles.
What methods to use: develop innovative, low-cost methods to collect data, such as mail and web.
How to assess causality: establish a control group for statistical comparison purposes.
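The control-group decision is what makes a causal claim possible: client outcomes are compared against the control group's. A minimal Python sketch with illustrative growth rates (not Maine data); Welch's t-statistic here is a standard technique, not one named in the deck:

```python
from statistics import mean, stdev

def treatment_effect(clients, controls):
    """Naive program effect: difference in mean outcome (e.g. annual
    employment growth rate) between program clients and controls."""
    return mean(clients) - mean(controls)

def welch_t(clients, controls):
    """Welch's t-statistic for the difference in means, as a rough
    signal of whether the gap exceeds sampling noise."""
    na, nb = len(clients), len(controls)
    se = (stdev(clients) ** 2 / na + stdev(controls) ** 2 / nb) ** 0.5
    return treatment_effect(clients, controls) / se
```

A raw difference in means is only credible to the extent that the control group genuinely resembles the clients; matching on size, age, and industry is the usual safeguard.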

Issues for Indicator Analysis
Linkage with Innovation Index activities: same or related indicators?
Degree of analysis
Sources that will be consistent over time
Data availability: data are not always available by state, region, county, or locality
Timeliness of data
What are appropriate comparison states/regions?

Decisions to Make about Case Studies
How to choose which ones to do
Who to interview: we suggest program managers, clients, and other stakeholders, e.g. board members, trade associations, and related programs.
How to ensure reliability and replicability of data: a protocol based on the indicators; maintain a database; consistency of process

Analysis and Interpretation
Be descriptive.
Note trends; it is especially useful to note a benchmark year, e.g. before the beginning of the program.
Normalize by population, gross state product, etc.
Graph for easy interpretation.
Acknowledge the limitations of the data and the research design.
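Normalizing and benchmark-year indexing can be expressed as two small transforms. An illustrative Python sketch; the years and values are made up, not Maine data:

```python
def index_to_base_year(series, base_year):
    """Re-express a yearly indicator with the benchmark year = 100,
    making trends comparable across states of different sizes."""
    base = series[base_year]
    return {year: 100.0 * value / base for year, value in series.items()}

def per_capita(series, population, scale=1000):
    """Norm an indicator by population (default: per 1,000 residents).
    Both arguments map year -> value for the same set of years."""
    return {y: scale * series[y] / population[y] for y in series}
```

Indexing to the year before a program began makes the "before" level an explicit 100, so any subsequent divergence from benchmark states is visible directly on a graph.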

Example: Maine's Evaluation of Public Investments in R&D
Context of the evaluation:
Maine has substantially increased its ongoing investments in R&D, starting in 1996.
The evaluation is legislatively mandated and funded by a "tax" on R&D investments.
The legislation required outside experts to perform the evaluation of all public R&D investments.
It is an ongoing process: initial evaluation and process design in 2001, annual data collection, and a five-year evaluation in 2006.

The Three Questions
1. How competitive is Maine's sponsored R&D, and has it improved over time?
2. What is the impact of Maine's R&D investment on the development of Maine's R&D industry?
3. What is the impact of Maine's R&D investment on the level of innovation and innovation-based economic development?

Results Reported for 2003
How competitive is Maine's sponsored R&D, and has it improved over time?
Maine started from a lagging position and is making some gains, but it is generally just keeping up, since other states are also investing heavily.
Maine appears to be gaining on other EPSCoR states in SBIR/STTR awards and in venture capital investments.

Results Reported for 2003
What is the impact of Maine's R&D investment on the development of Maine's R&D industry?
Maine made sizeable investments in research capacity in the late 1990s, and the intermediate outcomes are evident: more faculty, more research equipment and facilities, more proposals submitted, more publications.
However, there is little change in intellectual property, joint research with industry, or commercial outcomes.

Results Reported for 2003
What is the impact of Maine's R&D investment on the level of innovation and innovation-based economic development?
The state's R&D investments are reaching the appropriate targets: the clients are overwhelmingly small R&D companies (fewer than 10 employees, revenues less than $1 million, less than five years old).
The companies are reporting better-than-average results in employment growth, revenue growth, per capita income, and productivity.
We detect many elements of causality for gains in SBIR/STTR awards, intellectual property, and venture capital investments.

Challenges and Opportunities
Faces at the table change constantly over a six-year period.
There is general distrust of the evaluation process.
Many programs don't keep good records and/or contacts with past clients.
Research design for technology-based economic development is challenging because of the long lead times for outcomes to develop, the difficulty of assessing causality, and the lack of good measures for innovation per se.

Lessons Learned
Doing evaluation correctly is not cheap; surveying, in particular, is time-consuming.
Evaluation is an excellent tool for program management; it is less effective, but may be required, for accountability.
Credibility is linked to your program's overall positioning; a good evaluation can help, but not necessarily. A bad evaluation is a bad evaluation.
The work we are doing in technology-based economic development pays off in the mid to long run.