National Institute of Standards and Technology U.S. Department of Commerce Technology Program Evaluation: Methodologies from the Advanced Technology Program.

Presentation transcript:

National Institute of Standards and Technology U.S. Department of Commerce
Technology Program Evaluation: Methodologies from the Advanced Technology Program
Survey Development, Uses, and Findings
Stephen Campbell
Technology Innovation Program
National Institute of Standards and Technology
Collaborative Expedition Workshop #71
National Science Foundation
March 18, 2008

Framework
 Concepts to measure
   Mission
   Broader research interests
 Operationalizing the concepts
   Quantitative and/or qualitative
   Absolute and/or relative
 Uses and findings from the data
   Summary stats
   Fuller analysis (regression)

Different Surveys for Different Questions
 Did ATP select the “right” types of projects?
   “Process” evaluation
   Driven by mission and selection criteria
 What are the determinants of success in ATP projects?
   “Project” evaluation
   Define measures of success
   Are the selection criteria correlated with success? What are the other determinants?

Evaluating the Selection Process
 Selection criteria
   Innovative
   High-risk
   Encourage longer research time horizons
   Foster collaboration
   Broad-based benefits (diffusion)
   Appropriate need for ATP funding (counterfactual)
 Used the Survey of ATP Applicants
   Three surveys fielded for award competitions in 2000, 2002, 2004
   Surveyed both awardee and non-awardee companies
   Combined with administrative data

Operationalizing Selection Criteria (Example for Risk)
 Quantitative measures for technical risk of the ATP project and of a “typical” project at the company
 Did ATP fund higher-risk projects and/or projects with a greater “stretch” in the risk profile?
 Survey questions
   “From 0 to 100%, what would you say is the approximate probability that your proposed ATP project could fully achieve its technical goals? (Try to answer based on how you thought about your project when you proposed it to ATP)”
   “What is the approximate probability that a typical R&D project at your company could fully achieve its technical goals?”
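The risk “stretch” described above can be sketched as a simple computation over an applicant’s paired survey responses. This is an illustrative sketch only; the function and field names are hypothetical and do not come from ATP’s actual analysis files.

```python
# Sketch: measuring "stretch" in the risk profile from paired survey responses.
# All names and numbers here are hypothetical illustrations.

def risk_stretch(p_atp_project, p_typical_project):
    """Stretch = how much lower the self-reported probability of full
    technical success is for the proposed ATP project than for a
    typical R&D project at the same company.
    Inputs are the 0-100% survey responses."""
    return p_typical_project - p_atp_project

# Example applicant: a typical project succeeds 80% of the time, but the
# proposed ATP project only 50% -> a 30-point stretch in the risk profile.
print(risk_stretch(p_atp_project=50, p_typical_project=80))  # -> 30
```

A positive stretch indicates the proposed project is riskier than the company’s usual work; the same pattern applies to the time-horizon questions on the next slide.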

Operationalizing Risk (cont’d)

Operationalizing Selection Criteria (Example for Time Horizon)
 Quantitative measures for time to impact on revenues of the ATP project and of a “typical” project at the company
 Did ATP fund longer time-horizon projects and/or projects with a greater “stretch” in the time horizon?
 Survey questions
   “Approximately how many years after the start of your ATP proposed project could you expect results to first have an impact on company revenues? (Try to answer based on how you thought about your project when you proposed it to ATP)”
   “Approximately how many years after the start of a typical R&D project could you expect results to first have an impact on company revenues?”

Operationalizing Time Horizon (cont’d)

Regression Analysis of the Selection Process
 Projects with greater technical risk were more likely to receive an award
 Projects with a longer time horizon were more likely to receive an award
 Companies receiving an ATP award are able to leverage additional funding for the line of research around the ATP project in the post-application period
 Evidence of “value added” by the ATP selection panels
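The first finding above can be illustrated with a minimal sketch: do riskier applications get awarded at a higher rate? The records below are invented for illustration; ATP’s actual analysis used regression models with controls, not a simple cross-tabulation.

```python
# Minimal sketch of the selection-process question: does award probability
# rise with self-reported technical risk? Data are hypothetical.

applicants = [
    # (self-reported probability of full technical success, awarded?)
    (90, False), (85, False), (80, True), (75, False),
    (60, True), (55, True), (50, True), (45, False),
]

def award_rate(records):
    """Share of applicants in `records` that received an award."""
    awarded = sum(1 for _, got_award in records if got_award)
    return awarded / len(records)

# Lower success probability = higher technical risk.
high_risk = [r for r in applicants if r[0] < 70]
low_risk = [r for r in applicants if r[0] >= 70]

print(award_rate(high_risk))  # -> 0.75
print(award_rate(low_risk))   # -> 0.25
```

In this toy sample, the riskier half of applicants is awarded three times as often, mirroring the direction of the regression finding.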

Evaluating Project Success
 Content areas
   Key personnel
   Subcontracting
   Company characteristics
   ATP project characteristics
   Research effort
   Project management
   Research outputs
   Technology commercialization
 Uses the Business Reporting System
   Awarded companies surveyed annually from project start to project end
   Awarded companies surveyed 2, 4, and 6 years after project completion in a “Post-Project Survey”

Operationalizing Technical Diffusion
 Categorical measures for research results and know-how becoming known to others outside of the company
   Within 2 years of project end
   2 to 5 years after project end
   5 to 10 years after project end
   10 or more years after project end
   Never
 Survey questions
   “How quickly do you expect critical research results from this project to become known to others outside your company?”
   “How quickly do you expect critical research ‘know-how’ from this project to become known to others outside your company?”

Technical Diffusion (cont’d)
 Research results
   Within 2 years of project end: 72%
   2 to 5 years after project end: 24%
   5 to 10 years after project end: 3%
   10 or more years after project end: 0%
   Never: 1%
 Research know-how
   Within 2 years of project end: 39%
   2 to 5 years after project end: 42%
   5 to 10 years after project end: 12%
   10 or more years after project end: 5%
   Never: 2%
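Percentage distributions like the ones above are produced by tallying the categorical survey responses. A minimal sketch, using invented response data and hypothetical category labels:

```python
# Sketch: turning raw categorical survey responses into a percentage
# distribution like the diffusion tables above. Data are hypothetical.
from collections import Counter

responses = ["within_2", "within_2", "2_to_5", "within_2", "never",
             "2_to_5", "within_2", "5_to_10", "within_2", "within_2"]

def distribution(responses):
    """Map each response category to its share of all responses, in percent."""
    counts = Counter(responses)
    n = len(responses)
    return {category: 100 * count / n for category, count in counts.items()}

print(distribution(responses))
# -> {'within_2': 60.0, '2_to_5': 20.0, 'never': 10.0, '5_to_10': 10.0}
```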

Operationalizing Continued Technology Development
 Categorical measures for the significance of technical and non-technical challenges to overcome before achieving widespread benefits of the ATP technology
   Very significant
   Moderately significant
   Somewhat significant
   Not significant
 Survey questions
   “How significant are any additional technical (research, development) challenges that still need to be addressed in order to achieve widespread commercialization?”
   “How significant are any additional non-technical (regulatory, business) challenges that still need to be addressed in order to achieve widespread commercialization?”

Technology Development (cont’d)
 Technical challenges
   Very significant: 22%
   Moderately significant: 42%
   Somewhat significant: 29%
   Not significant: 7%
 Non-technical challenges
   Very significant: 22%
   Moderately significant: 41%
   Somewhat significant: 27%
   Not significant: 11%

Regression Analysis of the Determinants of Project Success
 Small companies are more likely than medium or large companies to experience early revenues from products incorporating the ATP-funded technology
 Joint-venture (non-lead) partners are less likely than single companies or joint-venture leads to realize both research and commercial outputs
 University participation in a project has a positive effect on knowledge creation and dissemination, as measured by journal publications
 Projects with university participation are less likely to stop early for adverse reasons, which may reflect essential differences in the motivation or characteristics of these projects
 Projects with greater innovativeness and risk (closer to the research frontier) are more likely to realize technical and commercial outputs

Concluding Thoughts
 ATP was “blessed” to be such a controversial and scrutinized program
 Early dedication to evaluation must be accompanied by resources (don’t forget about underlying database construction, maintenance, and integration issues)
 Will always face a budget constraint and must prioritize
   Basic reporting
   Larger, broader research portfolio interests
   “Boundary of the firm” issues