1
Evaluation of R&D Programs: State of the Art
Presented at the Washington Research Evaluation Network (WREN) Workshop, June 6, 2008
Gretchen Jordan, Sandia National Laboratories, USA
2
Interest is high - and all over the map
To name just a few:
WREN
OECD Workshop on Rethinking Evaluation in Science and Technology, Paris (TIP Working Group)
Atlanta S&T Policy Conferences
American Evaluation Association (AEA) Annual Conference - Research, Technology and Development (RTD) Evaluation Topical Interest Group (the most international of the TIGs)
European RTD Evaluation Network
European Court of Auditors review of the evaluation of European Union RTD programs
Investments in Health Research: Defining the Best Metrics - Canadian Academy of Health Sciences
Economic returns of medical research - Swedish Research Council, Medicine
NSF Science of Science and Innovation Policy grants program
American Chemical Society Petroleum Research Fund
3
Research Assessment in U.S.
Strengths
Project-level review by peers
Bibliometrics data for some fields
Retrospective case studies
A culture of evaluative inquiry and experimentation

Weaknesses
No agreed-upon way to assess portfolios
A small, splintered R&D evaluation community
Few new methods; existing methods often expensive
Data issues, including attribution

Opportunities
Requirements are centralizing and emphasize evaluation
Requirements have reached the level of labs and the bench
New computing power

Threats
Tension between a control mentality and the nature of scientific work
Performance indicators selected and used out of context
Reliance on the linear model

Source: G. Jordan, AAAS, February 18, 2005
4
A Glimpse of the State of the Art of RTD Evaluation
From presentations of the RTD Evaluation Topical Interest Group at the American Evaluation Association 2007 Conference:
Evaluation Systems
Frameworks
Methods
The 2007 RTD presentations can be viewed online. Join us at AEA in Denver, Colorado, November 5-8, 2008.
5
Evaluation Systems
6
Evaluation in the Policy Cycle
[Diagram: evaluation in the policy cycle, showing where technology assessment, foresight, and technology roadmapping fit. Source: Wolfgang Polt]
7
Evaluation system components - FP5, FP6, and FP7
[Timeline, 1998-2008: annual monitoring; five-year assessment; FP6 ex post evaluation; FP7 ex ante impact assessment; FP8 ex ante impact assessment; FP7 mid-term review; thematic-level evaluations; national impact studies. Source: European Commission, Research DG, November 2007]
8
Overall Coordination for National R&D Programs
[Diagram: overall coordination for national R&D programs, connecting program evaluation in national S&T activities with national S&T planning; S&T level and trend analysis; technology level assessment; the national standard S&T classification; S&T indicators and statistical analysis; S&T foresight and roadmapping; R&D surveys and analysis; national R&D priority setting; the five-year national R&D master plan; national R&D budget allocation; annual ministry action plans; ministry R&D programs; annual performance reviews; and national R&D program evaluation. Source: KISTEP session, 2007 AEA Conference]
9
Systemic Evaluation Frameworks
10
The intellectual battle has culminated in a ‘national innovation systems’ perspective …
[Diagram of a national innovation system. Source: Arnold and Kuhlmann, 2001]
11
A Theory-based Framework for Evaluating Diverse Portfolios of Scientific Work
[Diagram: a theory-based, multi-level framework in which all of these elements work together.
Macro: institutional rules as they affect the sector - key indicators for innovation bottlenecks and policy objectives; modes of coordination (are they effective?); capabilities (level, mix, availability); high-risk capital (is it available where needed?).
Meso: performance by sector and RTD arena - basic, applied, development, manufacturing, commercialization, and quality research; technical progress; network connectedness; innovation; socio-economic outcomes. If outcomes are not appearing, check for bottlenecks.
Micro: funds allocation by arena and organizational profile - do organizational attributes match the profile? Do the RTD arenas have sufficient funds? Do portfolios need more or less radical work, or larger scope?
Source: Jordan, Hage, and Mote, Vienna, April 24, 2006]
12
An Example of a Roadmap
13
The Logic of Indirect Programs to Diffuse Technologies or Practices:
[Logic model for U.S. Department of Energy EERE programs (John H. Reed, U.S. Department of Energy). The model links activities and their outputs to audiences, intermediate outcomes, and long-term impacts; we need to describe and measure the expected response of knowledge workers, public entities, businesses and manufacturers, and end users to show how activities are connected to impacts.
Activities: EERE programs typically analyze and plan; develop technical information; assist public entities; assist businesses; conduct outreach and partner; provide tools and technical assistance; assist and fund purchases; build infrastructure; review and report; and fund and promote adoption.
Audiences (partnering with or targeted): technical and other personnel in laboratories, government, firms, colleges, and universities; federal, state, and local agencies and nongovernmental organizations; investors and financiers, manufacturers, distributors, retailers, architects, engineers, and trades people; and end-user organizations, firms, and individuals.
Intermediate outcomes (new knowledge, alternative institutional arrangements and processes, new product and service ideas, new opportunities): create, advance, and package market and technical knowledge to make energy efficiency more accessible and implementable; change the policies, structure, and operation of public entities to smooth the advance of energy efficiency and clean energy supply; create and enhance products, create and align market channels, enhance marketing, and develop installation and support infrastructures; and adopt, replicate, institutionalize, and enculturate energy-efficient and clean energy supply practices and technologies.
Long-term outcomes and impacts: reduced energy use and emissions, increased clean energy supply, and enhanced productivity and global security.]
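As a rough illustration of how a logic model like this can be made operational for measurement, the sketch below uses a hypothetical Python structure (the LogicLink class, the example activities, and the indicators are all invented for illustration, not taken from the EERE model) to tie each activity to its target audience, its expected intermediate outcome, and at least one indicator, so an evaluator can spot links in the chain that are not yet measurable.

# Hypothetical sketch: representing a diffusion-program logic model so that
# each activity is explicitly linked to an audience, an expected intermediate
# outcome, and at least one measurable indicator. All names are illustrative.
from dataclasses import dataclass, field

@dataclass
class LogicLink:
    activity: str
    audience: str
    intermediate_outcome: str
    indicators: list = field(default_factory=list)

LONG_TERM_IMPACT = "Reduced energy use and emissions; increased clean energy supply"

logic_model = [
    LogicLink(
        activity="Develop technical information",
        audience="Technical personnel in labs, government, firms, universities",
        intermediate_outcome="New market and technical knowledge packaged for use",
        indicators=["tools and publications downloaded", "citations in standards"],
    ),
    LogicLink(
        activity="Assist public entities",
        audience="Federal, state, and local agencies and NGOs",
        intermediate_outcome="Changed policies and operations of public entities",
        indicators=["adopted codes or procurement rules"],
    ),
    LogicLink(
        activity="Assist businesses",
        audience="Manufacturers, distributors, retailers, financiers",
        intermediate_outcome="Enhanced products and aligned market channels",
        indicators=[],  # no indicator yet: a gap in the measurement plan
    ),
]

def gaps(model):
    """Return activities that lack a measurable indicator, i.e. places where
    the activity-to-impact chain cannot yet be evidenced."""
    return [link.activity for link in model if not link.indicators]

if __name__ == "__main__":
    for link in logic_model:
        print(f"{link.activity} -> {link.audience} -> {link.intermediate_outcome}")
    print("Long-term impact:", LONG_TERM_IMPACT)
    print("Unmeasured activities:", gaps(logic_model))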
14
Methods
15
Recent attempts to use peer review at a high level recognize systems complexity, and panels are increasingly supported by background studies carried out by innovation system and evaluation specialists.
Examples:
Finnish national innovation system review
EU five-year assessments
OECD innovation system reviews
EU-CREST policy mix reviews
[Diagram of the review set-up: evaluation customer, secretariat, panel, expert(s), and background work. Source: Erik Arnold]
16
Case Study - Using a "Follow-Up Chart"
Case study: device development through collaboration with upstream and downstream industrial technology.
[Follow-up chart plotting potential for practical use (low to high) against time, from planning through five project years to the post-project evaluation, with plus and minus elements marked along the way and an intermediate evaluation and post-project evaluation indicated on the timeline.
Plus elements: the NEDO project encouraged newcomers to collaborate with universities and to allocate new budget for R&D; excellent collaboration between industry and universities; good collaboration among private companies in the research phase; close cooperation between upstream and downstream industry in the development phase; accelerated R&D from setting a very challenging target; world-record performance achieved, including three times the previous performance with a new method, supporting an appropriate evaluation based on the R&D results; a network of personal contacts; and increased opportunities for business.
Minus elements: further technical problems must be solved before the results can be practically applied, and in the practical phase it can be difficult to maintain effective collaboration among companies.
Source: AEA 2007]
17
Historical Tracing Method
The historical tracing method works in two directions: forward tracing follows a line of R&D downstream to the outcomes it contributed to, while backward tracing starts from a selected outcome (a target innovation) and traces upstream to the R&D behind it.
[Diagram: forward tracing from an R&D project to Innovations 1 through 4; backward tracing from a target innovation back to upstream R&D. Source: TIA Consulting, Inc.]
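A minimal sketch of the bookkeeping behind such tracing, assuming the documented linkages have already been captured as a directed graph (the edge data and function names below are hypothetical, not TIA Consulting's tooling): forward tracing follows edges from an R&D project to everything downstream, and backward tracing follows the reversed edges from a target innovation to everything upstream.

# Hypothetical sketch of forward and backward historical tracing over a
# directed graph of documented linkages (R&D projects -> innovations).
from collections import deque

# edges[x] lists the items that x directly contributed to (illustrative data).
edges = {
    "R&D project A": ["Innovation 1", "Innovation 2"],
    "Innovation 1": ["Innovation 3"],
    "Innovation 2": ["Innovation 3", "Innovation 4"],
    "Innovation 3": ["Target innovation"],
}

def forward_trace(start, edges):
    """All downstream items reachable from a starting R&D project."""
    seen, queue = set(), deque([start])
    while queue:
        node = queue.popleft()
        for nxt in edges.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

def backward_trace(target, edges):
    """All upstream items that contributed, directly or indirectly, to a target."""
    reverse = {}
    for src, dsts in edges.items():
        for dst in dsts:
            reverse.setdefault(dst, []).append(src)
    return forward_trace(target, reverse)

if __name__ == "__main__":
    print("Forward from R&D project A:", forward_trace("R&D project A", edges))
    print("Backward from the target innovation:", backward_trace("Target innovation", edges))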
18
Second Generation Patent Tree for US 5348822, Issued to Ovonic Battery Company in 1994
[Second-generation patent citation tree graphic. Source: TIA Consulting, Inc.]
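A second-generation tree like this can be assembled mechanically once forward citations are available. Below is a sketch under that assumption; the cited_by data is an invented stand-in, not the actual citation record of US 5348822, and the function name is hypothetical.

# Hypothetical sketch: building a two-generation forward-citation tree from a
# seed patent. cited_by is illustrative stand-in data; in practice it would
# come from a patent citation database.
cited_by = {
    "US5348822": ["citing patent A", "citing patent B"],
    "citing patent A": ["citing patent C"],
    "citing patent B": ["citing patent D", "citing patent E"],
}

def patent_tree(seed, cited_by, generations=2):
    """Nested dict: each key is a citing patent, each value the tree of its citers."""
    if generations == 0:
        return {}
    return {
        citer: patent_tree(citer, cited_by, generations - 1)
        for citer in cited_by.get(seed, [])
    }

if __name__ == "__main__":
    tree = patent_tree("US5348822", cited_by, generations=2)
    print(tree)
    # Count second-generation descendants as a rough indicator of downstream influence.
    second_generation = sum(len(children) for children in tree.values())
    print("Second-generation citing patents:", second_generation)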
19
Question 2: What are the Network Characteristics?
Gatekeepers: bridging research and deployment networks. Organizations that participate in both the research network and the deployment network work as gatekeepers. There are 277 gatekeeper organizations, and one third of the links in each of the two networks are bridging links.
Franco Malerba, Nicholas Vonortas, Caroline Wagner, Lorenzo Cassi, Nicoletta Corrocher
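One way such figures can be computed, sketched below with invented edge lists (the data, names, and the reading of "bridging link" as a link with at least one gatekeeper endpoint are all assumptions, not the authors' definitions or dataset): gatekeepers are simply the organizations present in both networks.

# Hypothetical sketch: identifying gatekeeper organizations that appear in both
# a research network and a deployment network, and the share of links touching
# them. Edge lists are illustrative stand-ins.
research_links = [("OrgA", "OrgB"), ("OrgB", "OrgC"), ("OrgA", "OrgD"), ("OrgD", "OrgG")]
deployment_links = [("OrgC", "OrgE"), ("OrgE", "OrgF"), ("OrgB", "OrgF"), ("OrgE", "OrgH")]

def organizations(links):
    """Set of all organizations appearing in an edge list."""
    return {org for link in links for org in link}

gatekeepers = organizations(research_links) & organizations(deployment_links)

def bridging_share(links, gatekeepers):
    """Fraction of links with at least one gatekeeper endpoint
    (one plausible reading of 'bridging link')."""
    bridging = [link for link in links if gatekeepers & set(link)]
    return len(bridging) / len(links)

if __name__ == "__main__":
    print("Gatekeepers:", sorted(gatekeepers))
    print("Bridging share, research network:", bridging_share(research_links, gatekeepers))
    print("Bridging share, deployment network:", bridging_share(deployment_links, gatekeepers))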
20
Measuring the Interdisciplinarity of a Body of Research
We can also look at integration to see how it correlates with other measures. Doing this, we see that our Integration measure has a very weak correlation with both the number of authors per paper and the number of affiliations per paper. So working with many people from different departments does not, by itself, contribute much to integration as we measure it.
David Roessner, Alan Porter, Anne Heberger, Alex Cohen, and Marty Perrault
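For readers unfamiliar with this kind of measure, the sketch below shows one common formulation of an Integration-style score, Rao-Stirling diversity over the subject categories a paper cites (1 minus the similarity-weighted sum over category pairs), together with a simple correlation against author counts. The similarity matrix, category proportions, and author counts are invented for illustration and are not the study's data or necessarily the exact formula the authors used.

# Hypothetical sketch: an Integration-style interdisciplinarity score
# (Rao-Stirling diversity over cited subject categories) and its correlation
# with the number of authors per paper. All numbers are invented.
import numpy as np

def integration_score(proportions, similarity):
    """1 - sum_ij s_ij * p_i * p_j  (higher = more interdisciplinary)."""
    p = np.asarray(proportions, dtype=float)
    p = p / p.sum()
    return 1.0 - float(p @ np.asarray(similarity) @ p)

# Cosine similarity between three illustrative subject categories.
similarity = np.array([
    [1.0, 0.6, 0.1],
    [0.6, 1.0, 0.2],
    [0.1, 0.2, 1.0],
])

# Each row: one paper's share of cited references in each category.
papers = [
    [0.8, 0.2, 0.0],
    [0.4, 0.4, 0.2],
    [0.1, 0.3, 0.6],
    [1.0, 0.0, 0.0],
]
authors_per_paper = [3, 7, 5, 2]

scores = [integration_score(p, similarity) for p in papers]
corr = np.corrcoef(scores, authors_per_paper)[0, 1]
print("Integration scores:", [round(s, 3) for s in scores])
print("Correlation with author count:", round(corr, 3))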
21
Summary and Conclusions
The vision and agenda of the NSF-sponsored 2001 workshop remain relevant.
Vision: more valuable and valued R&D assessment that contributes to a broad community understanding of S&T innovation systems and of how they lead to, and are influenced by, social, technical, economic, and political development.
Proposed agenda:
Better networks and more training
Better data and methods
Research on innovation, societal and regional impacts, and the complexity of emerging research organizations
Communicate better!