1 Defining and Measuring Success in Technology-based Economic Development
Catherine Searle Renault, Ph.D.
RTI International, Center for Technology Applications
Science, Technology and Economic Growth: A Practicum for States
March 23, 2004
2 Overview
- Principles of Evaluation and Measurement
- Theory of Technology-based Economic Development
- Indicators
- Ways to Collect Data
- Example: Evaluation of Maine's Public Investment in R&D
- Challenges and Opportunities
- Lessons Learned
3 Why Measure and Evaluate?
Evaluation is the collection, analysis, interpretation and communication of information about the effectiveness of programs undertaken for the public good. Evaluation:
- Aids decisions about whether a program should be expanded, continued, improved or curtailed
- Increases the effectiveness of program management
- Satisfies calls for accountability
- Measures the impact on the core problem
4 Key Concepts
- Evaluation is a process, not an event.
- Evaluation is for practical use, not to sit on the shelf.
- The questions to be answered are derived from the program itself.
- Evaluation compares "what is" with "what would have been" and "what should be."
- It takes place in a setting where work and programs are ongoing.
5 Definitions
- Program: a complete initiative, e.g. the Advanced Technology Program (ATP)
- Project: one interaction with a client, e.g. a single ATP award
- Inputs: resources used to produce outputs and outcomes
- Outputs: products and services delivered; the completed products of internal activities
- Outcomes: events, occurrences or conditions outside the program itself that are important
- Intermediate outcomes: important outcomes, but not ends in themselves
- End outcomes: the sought-after results
6 General Logic Model
INPUTS (personnel, facilities, funding) → OUTPUTS (activities, clients served, awards made) → INTERMEDIATE OUTCOMES (mid-point events) → END OUTCOMES (ending events)
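The four-stage logic model above can be captured as a simple data structure. This is a minimal illustrative sketch of our own (the class name and example entries are not from the slides), useful for keeping a program's evaluation plan machine-readable:

```python
# Minimal sketch of the general logic model as a data structure.
# The class and the example entries are illustrative, not from the deck.
from dataclasses import dataclass, field


@dataclass
class LogicModel:
    inputs: list[str] = field(default_factory=list)                 # personnel, facilities, funding
    outputs: list[str] = field(default_factory=list)                # activities, clients served, awards made
    intermediate_outcomes: list[str] = field(default_factory=list)  # mid-point events
    end_outcomes: list[str] = field(default_factory=list)           # ending events


# Hypothetical example for a state R&D grant program
model = LogicModel(
    inputs=["personnel", "facilities", "funding"],
    outputs=["activities", "clients served", "awards made"],
    intermediate_outcomes=["SBIR/STTR awards won", "patents filed"],
    end_outcomes=["high-wage jobs created"],
)
```

Writing the model down this way forces the clear articulation of goals that the evaluation principles later call for.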
7 Definitions
- Measurement/monitoring: What are the outcomes of the program/project?
- Impact measurement: Calculate the economic impact of the outcomes
- Program/project evaluation: Involves causality
8 Causality and Attribution
To prove causality, you need four conditions:
1. The outcomes exist
2. The inputs precede the outcomes
3. The inputs, outputs and outcomes are related
4. All other explanations are accounted for
Attribution is weaker, but easier to prove: clients say they attribute their results to the program/project.
9 Principles for Evaluation
- If there is more than one program, establish a consistent approach for all programs
- Ensure clear articulation of goals in terms as concrete as possible
- Be as rigorous as possible in design and analysis to increase validity and credibility, but make tradeoffs that reflect operational issues
- Gather evaluation data at the state level as well as data for individual program management
10 Program Theory-based Evaluation
Use the theory behind the intervention to design appropriate indicators of intermediate and end outcomes:
1. Identify the goals and objectives of the program
2. Construct a model of what the program is supposed to accomplish
3. Collect data to compare goals, actual observed outcomes, and what would have happened otherwise (i.e., without the intervention)
4. Analyze and interpret the results
11 Goals and Objectives of Technology-based Economic Development
Improve citizens' quality of life by:
- Creating and retaining high-quality jobs (defined as higher-paying), generally in technology-based businesses
- Creating and retaining (and in some cases, recruiting) high-quality companies (defined as high-growth, high-paying), generally in technology-based industries
- Improving the stability and/or competitiveness of the local and regional economy through innovation
12 Logic Model for Technology-Based Economic Development
[Diagram: research institutions and technology transfer offices feed an R&D-driven industry and innovation economy. Flows shown include government R&D grants, foundation funding, research funding, and debt and equity funding; joint research, students and workforce; basic research, applied research and market development; competition and the cost of doing business.]
13 Product/Company Life Cycle Model
Basic Research → Applied Research → Product Launch → Enhance Product → Product Maturity
14 Interventions to Build an Innovation Economy
Life-cycle stages addressed: company basic/applied research → design for manufacturing → product launch → enhance product → product maturity
Intervention types:
- Build research capacity
- Technical assistance: Centers of Excellence, Advanced Manufacturing Centers, MEP, Sea Grant, CSREES
- Business assistance: incubators, business development, science parks
- Funding: EPSCoR, federal funding, ATP, SBIR, STTR, state research grants, state-sponsored seed funds, SBA loans
15 Intermediate Indicators
Researchers:
- S&E graduate students
- Federal R&D grants
- R&D expenditures
- Patents
- Publications
- New sponsored R&D with local companies
Companies:
- Patents
- Venture capital raised
- SBIR and STTR awards won
- Other federal programs won (e.g., ATP)
- M&A activity
- IPO activity
16 End Outcome Indicators
- Average annual earnings of employees
- Number of high-technology companies in the state/region
- Number of scientists and engineers employed in the state/region
- Number of company births, especially high-technology
- Percent of revenue from outside the state
- Revenue per employee (productivity)
17 Collecting Data
Three possible methods; use one or all:
- Annual survey of all recipients of (all) programs
  - Use with a control group to assess causality
  - Potentially split companies and research institutions
- Indicator data for the state and benchmark states, to assess changes in competitiveness
- Case studies, to understand detailed trends
18 Key Decisions for Annual Survey
- Who to survey: the universe of companies and researchers; develop a single list; sample or survey all?
- What is the unit of analysis: company or project?
- How frequently: annually? Keep respondents on the list for five years
- When to survey: July–August is a good match for government reporting, but poor for companies
- When to analyze data and report: driven by state budget cycles
- What methods to use: develop innovative, low-cost methods to collect data (mail and web)
- How to assess causality: establish a control group for statistical comparison
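The last decision above, comparing program clients against a control group, can be as simple as a two-sample test on a surveyed outcome. A hypothetical sketch (all figures invented for illustration; real survey data would be used in practice):

```python
# Hypothetical sketch: compare program clients with a control group of
# similar non-client firms on a surveyed outcome (% employment growth),
# using a Welch t-statistic. All numbers below are invented.
import math
from statistics import mean, stdev

clients = [12.0, 8.5, 15.0, 9.0, 11.5]   # % employment growth, program clients
control = [4.0, 6.5, 3.0, 7.0, 5.5]      # % employment growth, comparison firms


def welch_t(a: list[float], b: list[float]) -> float:
    """Welch's t-statistic for two samples with unequal variances."""
    var_a = stdev(a) ** 2 / len(a)
    var_b = stdev(b) ** 2 / len(b)
    return (mean(a) - mean(b)) / math.sqrt(var_a + var_b)


t = welch_t(clients, control)
print(f"client mean {mean(clients):.1f}%, control mean {mean(control):.1f}%, t = {t:.2f}")
```

A large t-statistic supports (but does not by itself prove) the causal story; the four conditions on the earlier slide still have to hold.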
19 Issues for Indicator Analysis
- Linkage with Innovation Index activities: same or related indicators?
- Degree of analysis
- Sources that will be consistent over time
- Data availability: not always available by state, region, county or locality
- Timeliness of data
- What are appropriate comparison states/regions?
20 Decisions to Make about Case Studies
- How to choose which ones to do
- Who to interview: we suggest program managers, clients and other stakeholders, e.g. board members, trade associations, related programs
- How to ensure reliability and replicability of data:
  - Protocol based on indicators
  - Maintain a database
  - Consistency of process
21 Analysis and Interpretation
- Be descriptive
- Note trends; it is especially useful to note a benchmark year, e.g. before the beginning of the program
- Normalize by population, gross state product, etc.
- Graph for easy interpretation
- Acknowledge the limitations of the data and research design
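Normalizing by population and indexing to a benchmark year, as suggested above, is simple arithmetic. A short sketch with invented figures (the population and R&D spending values below are hypothetical, for demonstration only):

```python
# Illustrative sketch: normalize a raw indicator (R&D expenditures) by
# population, then index it to a benchmark year. All values are invented.

pop = {2001: 1_280_000, 2002: 1_290_000, 2003: 1_300_000}          # state population
rd_spend = {2001: 64_000_000, 2002: 77_400_000, 2003: 91_000_000}  # R&D expenditures ($)

# Norm by population: dollars per capita
per_capita = {yr: rd_spend[yr] / pop[yr] for yr in rd_spend}

# Index to the benchmark year (e.g., the year before the program began)
benchmark = 2001
indexed = {yr: 100 * per_capita[yr] / per_capita[benchmark] for yr in per_capita}

for yr in sorted(indexed):
    print(f"{yr}: ${per_capita[yr]:.2f} per capita, index {indexed[yr]:.0f}")
```

The same indexing applied to comparison states makes changes in relative competitiveness visible even when the absolute levels differ widely.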
22 Example: Maine Evaluation of Public Investments in R&D
Context of the evaluation:
- Maine has substantially increased its ongoing investments in R&D, starting in 1996
- Evaluation legislatively mandated, funded by a "tax" on R&D investments
- Outside experts required to perform the evaluation of all public R&D investments
- An ongoing process: initial evaluation and process design in 2001, annual data collection, and a five-year evaluation in 2006
23 The Three Questions
1. How competitive is Maine's sponsored R&D, and has it improved over time?
2. What is the impact of Maine's R&D investment on the development of Maine's R&D industry?
3. What is the impact of Maine's R&D investment on the level of innovation and innovation-based economic development?
24 Results Reported for 2003
How competitive is Maine's sponsored R&D, and has it improved over time?
- Maine started from a lagging position and is making some gains, but is generally just keeping up, since other states are also investing heavily.
- Maine appears to be gaining on other EPSCoR states in SBIR/STTR awards and in venture capital investments.
25 Results Reported for 2003
What is the impact of Maine's R&D investment on the development of Maine's R&D industry?
- Maine made sizeable investments in research capacity in the late 1990s, and the intermediate outcomes are evident: more faculty, more research equipment and facilities, more proposals submitted, more publications.
- However, there is little change in intellectual property, joint research with industry, or commercial outcomes.
26 Results Reported for 2003
What is the impact of Maine's R&D investment on the level of innovation and innovation-based economic development?
- The state's R&D investments are reaching the appropriate targets: the clients are overwhelmingly small R&D companies (fewer than 10 employees, revenues under $1 million, less than five years old).
- The companies are reporting better-than-average results in employment growth, revenue growth, per capita income and productivity.
- We detect many elements of causality for gains in SBIR/STTR awards, intellectual property and venture capital investments.
27 Challenges and Opportunities
- Faces at the table change constantly over a six-year period
- General distrust of the evaluation process
- Many programs do not keep good records and/or contacts with past clients
- Research design for technology-based economic development is challenging because of the long lead times for outcomes to develop, the difficulty of assessing causality, and the lack of good measures for innovation per se
28 Lessons Learned
- Doing evaluation correctly is not cheap; surveying, in particular, is time consuming.
- Evaluation is an excellent tool for program management; it is less effective, but may be required, for accountability.
- Credibility is linked to your program's overall positioning; a good evaluation can help, but not necessarily. A bad evaluation is a bad evaluation.
- The work we are doing in technology-based economic development pays off in the mid- to long run.