1 Evaluation of Large Initiatives of Scientific Research at the National Institutes of Health
Mary Kane, Concept Systems Inc.
William M. Trochim, Cornell University
American Evaluation Association, November 4, 2006

2 The Context
Changing nature of science: interdisciplinary, collaborative; large initiatives for complex problems
Expansion of use of large center grants as a research funding mechanism
Similar issues reported in the European Union (EU) in connection with the evaluation of Science, Technology, and Innovation (STI) policies
Government-wide accountability expectations: GPRA, PART, ExpectMore.gov
Good science requires good management

3 Evaluation of Large Initiatives
National Cancer Institute: Transdisciplinary Tobacco Use Research Centers (TTURCs), 2001–2003
Centers for Disease Control and Prevention: Prevention Research Centers Network, 2003–2005
National Institute of Allergy and Infectious Diseases: AIDS Clinical Trials Network, Division of AIDS, National Institutes of Health, 2005–present

4 Evaluation Approach
Culture change
Collaboration and involvement of researchers, funders, consultants
Understand initiative life cycle
Develop initiative logic model
Link comprehensive measures and tools to model
Keep costs and respondent burden low
Assure scientific objectivity and credibility
Address multiple purposes and audiences
Design for re-use where possible
Report and utilize results
Provide an opportunity for reflection and learning

5 Initiative Life Cycle Model
(figure: life-cycle diagram linking Context, Stakeholders, Questions, Measures, and the Conceptual Model)
The context includes the organizational structures and constraints that delimit evaluation activities; issues include motivation, capacity, structure, expertise, and support.
At each stage a wide variety of stakeholders need to be involved, both in helping determine what questions should be addressed in the evaluation and in providing their assessments of initiative performance and outcomes.
At each stage there are a variety of evaluation questions, with more prospective questions earlier in the life cycle and more retrospective ones later. Processes are needed for prioritizing which questions will be addressed at each stage.
Evaluation is an empirical activity; consequently, measures related to the constructs in the conceptual model are needed at every stage.

6 Structured Conceptualization
(figure: conceptual model relating the initiative life cycle (plan, develop, implement, disseminate) to the policy context, strategic goals, new initiatives, strategic impact, and policy implications, and to formative/ex ante and summative/ex post methods)
Evaluation methods: needs assessment, evaluability assessment, implementation evaluation, process evaluation, outcome evaluation, impact evaluation, cost-effectiveness and cost-benefit evaluation, secondary analysis, meta-evaluation

7 The TTURC Case Study
Transdisciplinary Tobacco Use Research Centers
History: RFA released 12/98; grants reviewed 7/99; first award 9/99; reissuance 9/04
Approximately $75 million in first phase
TTURC Life Cycle Model

8 Model Development
Concept map clusters: Engage the Community; Diversity & Sensitivity; Relationships & Recognition; Active Dissemination; Technical Assistance; Training; Research Methods; Research Agenda; Core Expertise & Resources; Evaluation System Plan
Logic model (Inputs, Activities, Outputs, Outcomes): maps the clusters (Core Expertise & Resources, Research Agenda, Training, Technical Assistance, Engage the Community, Active Dissemination) onto a pathway toward community health change

9 Measures & Analyses
(figure: measures mapped to analyses, organized by the conceptual map & logic model)
Measures: Financial Report (SF259a); Budget & Justification; Progress Report Summary; Progress Report (PHS2590); Expenditures & Carryover; Researcher Form; Personnel Report; Publications
Analyses: Bibliometrics; Peer Evaluation; Financial Analysis; Personnel Analysis; Evaluation Analysis; Content Analysis; Survey Analysis

10 Evaluation Questions
1. How well is the collaborative transdisciplinary work of the centers (including training) accomplished?
2. Does the collaborative transdisciplinary research of the centers lead to the development of new or improved research methods?
3. Does the collaborative transdisciplinary research of the centers lead to the development of new or improved scientific models and theories?
4. Does TTURC research result in scientific publications that are recognized as high-quality?
5. Does TTURC research get communicated effectively?
6. Are models and methods translated into improved interventions?
7. Does TTURC research influence health practice?
8. Does TTURC research influence health policy?
9. Does TTURC research influence health outcomes?

11 1. How well is the collaborative transdisciplinary work of the centers accomplished?
Subquestions:
What are TTURC researcher attitudes about collaboration and transdisciplinary research?
How do researchers assess the performance of their centers on collaboration, transdisciplinary research, training, institutional support, and center management?
What are examples of the collaboration, transdisciplinary research, and training activities of the centers?
What are the quality and impact of the collaboration, transdisciplinary research, and training activities of the centers?
Do TTURC research publications provide evidence of collaboration and transdisciplinary research, and how do they compare with "traditional" research?
How effective and efficient is the management of the TTURCs?

12 Evaluation Questions
1. How well is the collaborative transdisciplinary work of the centers accomplished?
Data sources:
Researcher Form: Attitudes about Transdisciplinary Research Scale (15 items); Center Collaboration Scale (15 items); Attitudes about Collaboration in Research Scale (8 items); Institutional Support Index (12 items); overall ratings of collaboration, transdisciplinary integration, training, and institutional support
Content analysis of annual progress reports for activities, results, and barriers (coded on collaboration, transdisciplinary integration, training, institutional support)
Peer evaluation: annual progress reports; publications
Bibliometric analysis of publications: collaboration within and across institutions and centers; number of fields represented by publications, cited, and citing articles, weighted by impact of journals
Management analysis: personnel; budget and financial

13 Researcher Form
Each center responsible for generating measures for 3-4 clusters on the map (at least two centers reviewed each cluster)
Compiled into measure development database; draft measure produced
25 closed-ended questions, each with multiple subquestions
Overall performance ratings by outcome area
Open-ended comments
244 specific measurement items proposed across the 13 content clusters

14 Scales and Indexes
Attitudes about Transdisciplinary Research Scale (15 items)
Center Collaboration Scale (15 items)
Attitudes about Collaboration in Research Scale (8 items)
Institutional Support Index (12 items)
Methods Progress Scale (7 items)
Science and Models Scale (17 items)
Barriers to Communications Scale (8 items)
Center-to-Researcher Communications (5 items)
Center External Communications (2 items)
Progress on Development of Interventions Index (12 items)
Policy Impact Index (4 items)
Translation to Practice Index (9 items)
Health Outcome Impact Scale (6 items)

15 Researcher Survey
8. Collaboration within the center
(figure: mean ratings with 95% confidence intervals for items a through o, on a scale from roughly 3.4 to 4.6)
a. Support staffing for the collaboration.
b. Physical environment support (e.g., meeting space) for collaboration.
c. Acceptance of new ideas.
d. Communication among collaborators.
e. Ability to capitalize on the strengths of different researchers.
f. Organization or structure of collaborative teams.
g. Resolution of conflicts among collaborators.
h. Ability to accommodate different working styles of collaborators.
i. Integration of research methods from different fields.
j. Integration of theories and models from different fields.
k. Involvement of collaborators from outside the center.
l. Involvement of collaborators from diverse disciplines.
m. Productivity of collaboration meetings.
n. Productivity in developing new products (e.g., papers, proposals, courses).
o. Overall productivity of collaboration.
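As an aside, a minimal sketch of how a mean rating and its 95% confidence interval could be computed for one such survey item; the ratings and the normal-approximation interval below are illustrative assumptions, not the actual analysis used in the evaluation.

# Minimal sketch (hypothetical data): mean and 95% confidence interval for one
# survey item, assuming ratings on a 1-5 scale and a normal approximation.
import math

ratings = [4, 5, 3, 4, 4, 5, 4, 3, 5, 4]  # hypothetical responses to one item

n = len(ratings)
mean = sum(ratings) / n
sd = math.sqrt(sum((r - mean) ** 2 for r in ratings) / (n - 1))
se = sd / math.sqrt(n)
ci_low, ci_high = mean - 1.96 * se, mean + 1.96 * se

print(f"mean = {mean:.2f}, 95% CI = ({ci_low:.2f}, {ci_high:.2f})")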

16 Content Analysis
Code approximately 80-90 project reports per year by the 13 outcome clusters
Three rounds of reliability testing and refinement of coding definitions
Final reliability > .9
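For illustration, a minimal sketch of one common inter-rater agreement statistic, Cohen's kappa; the slides do not say which reliability statistic was actually used, and the coder data below are hypothetical.

# Minimal sketch (hypothetical data): Cohen's kappa as one common way to check
# inter-rater reliability for content coding.
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    categories = set(freq_a) | set(freq_b)
    expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
    return (observed - expected) / (1 - expected)

# Two coders assigning report excerpts to outcome clusters (hypothetical).
a = ["collaboration", "training", "methods", "training", "collaboration", "methods"]
b = ["collaboration", "training", "methods", "collaboration", "collaboration", "methods"]
print(f"kappa = {cohens_kappa(a, b):.2f}")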

17 Progress Report Content Analysis – Years 1-3
(figure: content-analysis results by outcome cluster: Collaboration, Transdisciplinary Integration, External Recognition and Support, Science & Models, Publications, Interventions, Communication, Policy Implications, Translation to Practice, Health Outcomes, Training, Internal Recognition and Support, Methods)
(data from content analysis of Annual Progress Report Form PHS2590)

18 Peer Evaluation – Years 1-3
(figure: peer-evaluation results by outcome cluster: Training, Collaboration, Transdisciplinary Integration, Internal Recognition and Support, External Recognition and Support, Methods, Science & Models, Publications, Interventions, Communication, Policy Implications, Translation to Practice, Health Outcomes)

19 Bibliometric Analysis
What is a TTURC publication?
Results from TTURC research
Cites TTURC grant number
Independent peer evaluation would identify the influence
Components of bibliometric analysis:
Publications, citations, cited works (references)
Journals of publication, citing, and cited
Field (Current Contents)
Year

20 Bibliometric Analysis Indicators
Journal Impact Factor (JIF): average number of citations to all articles a journal published in the previous two years
Journal Performance Indicator (JPI): average number of citations to date for all publications in a journal in a particular year
Field Journal Performance Indicator: JPI for all journals in a field
Adjusted Journal Performance Indicator (Expected Citations): JPI for a specific type of publication
5-Year Impact: average number of citations to publications over a five-year period
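To make the first indicator concrete, here is a minimal sketch of the conventional two-year Journal Impact Factor calculation; the counts are hypothetical and the function name is illustrative.

# Minimal sketch (hypothetical counts): the conventional two-year Journal Impact
# Factor for year Y is citations received in Y by items the journal published in
# Y-1 and Y-2, divided by the number of items published in Y-1 and Y-2.
def journal_impact_factor(citations_in_y, items_y_minus_1, items_y_minus_2):
    return citations_in_y / (items_y_minus_1 + items_y_minus_2)

# Hypothetical journal: 480 citations in 2006 to articles published in 2004-2005.
print(journal_impact_factor(citations_in_y=480, items_y_minus_1=110, items_y_minus_2=90))  # 2.4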

21 Bibliometrics
On average, there were 0.64 more citations of TTURC publications than of other publications in the same journal.
On average, there were 0.6 more citations of TTURC publications than of other publications in the same field.
Citation of TTURC publications is significantly higher than for the journal and field comparison groups.
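A minimal sketch of the kind of comparison behind these figures: the average difference between a publication's observed citations and the expected citations for its journal; all numbers below are hypothetical.

# Minimal sketch (hypothetical numbers): average excess citations per publication
# relative to the journal's expected (average) citations per paper.
pubs = [
    # (citations to the publication, average citations per paper in its journal)
    (6, 4.8),
    (3, 3.1),
    (9, 7.5),
]

diffs = [observed - expected for observed, expected in pubs]
print(f"mean excess citations per publication = {sum(diffs) / len(diffs):.2f}")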

22 Bibliometrics
Only the two complete years were used in this analysis.
Citations were lower than expected in year 1 and higher than expected in year 2.
Citation of TTURC research publications is increasing significantly over time relative to expectation.

23 Financial Analysis
(figure: cumulative percent of federal funds spent, by grantee; data from grantee Financial Status Reports)

24 Carryover
(figure: percent of subprojects, by center and year, that reported a carryover; data from Budget Justification, Annual Progress Report Form PHS2590)

25 Reasons for Carryover
(figure: proportion of subprojects citing each reason: delay of project start, unanticipated obstacles, practical changes in process, other (specify), not stated)
Causes of Delay or Unanticipated Obstacles
(figure: proportion citing each cause: staffing issue, implementation or logistical issue, research/methods issue, granting agency issue, infrastructure issue, other (specify), not stated)
(data from Budget Justification, Annual Progress Report Form PHS2590)

26 What Worked
Less promising:
Researcher Survey (one wave)
Content Analysis (costly, time consuming)
Peer evaluation of publications
More promising:
Researcher Survey scales
Peer evaluation of progress reports
Financial Analysis
Bibliometrics

27 Conclusions
Sustainability challenges: funding challenges; researcher motivation
Methodological challenges: peer review; bibliometrics; integrating results
Organizational challenges: agency resources; grantee resources; external contractors
Utilization challenges: building over multiple time points; building over multiple initiatives

