Challenges in Evaluating Basic Science Investments: a Funder’s Perspective Julia Klebanov.


1 Challenges in Evaluating Basic Science Investments: a Funder’s Perspective
Julia Klebanov

2 Gordon and Betty Moore Foundation
Fosters path-breaking scientific discovery, environmental conservation, patient care improvements, and preservation of the character of the San Francisco Bay Area
Founded November 2000
Endowment of $6.4 billion

3 Gordon and Betty Moore Foundation
“We want the foundation to tackle large, important issues at a scale where it can achieve significant and measurable impacts.” –Gordon Moore

4 Science Program
Seeks to advance basic science by developing new technologies, supporting imaginative scientists, and creating new collaborations at the frontiers of traditional scientific disciplines.
We fund research designed to:
- Advance our understanding of the world by asking new questions
- Enable new science through advances in technology
- Break down barriers and cultivate collaborations
- Enhance society's understanding of the joy of science
Courtesy of Princeton University

5 Science Program Areas
- Marine Microbiology Initiative
- Data-Driven Discovery Initiative
- Emergent Phenomena in Quantum Systems Initiative
- Thirty Meter Telescope
- Imaging
- Science Learning
- Astronomy Special Projects
Courtesy of Sossina Haile

6 Measurement, Evaluation and Learning in Philanthropy
- Measurement: internal process of gathering information to monitor progress in the implementation of our work; occurs on an ongoing basis
- Evaluation: periodic assessments of ongoing or completed work; conducted either internally or by an external third party
- Learning: using data and insights from measurement and evaluation to inform strategy and decision-making

7 Measurement, Evaluation and Learning at Moore
Responsible for ensuring access to the best evaluation, data, knowledge management, and measurement systems and practices that support evidence-based decision-making
Examples:
- Working with Science program staff to develop measurement frameworks
- Designing and managing external evaluations
- Facilitating internal reviews
- Field-building

8 Developing a Funding/Evaluation Strategy
- Unit of funding (e.g., individuals, institutions, projects)
- Risk: supporting risky research is often a niche for philanthropies
- Integrating both financial and non-financial supports

9 How do we monitor and evaluate our investments?
- Grantee requirements
- Annual reports: self-reported data
- Meetings/site visits
- Tracking quantitative data to measure scientific output (e.g., publications, presentations, number of instruments developed)
- Internal research/strategy reviews
- External evaluations

10 Conceptual Challenges for Monitoring & Evaluation
- Basic science does not follow a linear path
- Difficulty of setting up a measurement framework prospectively for outcomes that cannot be precisely defined
- Tension between setting aspirational outcomes and being realistic about what's achievable during the life of a portfolio

11 Conceptual Challenges for Monitoring & Evaluation
Many of our initiatives have strategies aimed at developing new ways of thinking or changing the culture of a research community.
- Challenge of capturing the nuances of progress toward these types of outcomes
Timescale issues:
- Large initiatives are typically approved for 5-7 years
- Evaluations are conducted ~4-5 years into the life of an initiative
- Ultimate impact is not expected until many years later

12 Measurement Challenges
- Most data are self-reported: how can we objectively measure the progress of grants?
- Limited baseline data: how do we collect this for the "state of the field"?
- Bibliometrics: grantees fail to acknowledge funding, and citation counts don't always capture quality
- Investigator counterfactual: would they have done it anyway?
- Contribution vs. attribution

13 Measurement Challenges
Informal collection of qualitative data (e.g., getting out in the field, talking with grantees and members of the scientific community) raises different issues:
- Diminishes the rigor of systematic data collection
- May be biased by what grantees think you want to hear
- How do we aggregate the progress reported by grantees and roll it up to better understand progress toward overarching initiative outcomes?

14 How does this all relate to open science?
- Need to be able to measure research outputs as early and as often as possible
- Limits of bibliometric analyses
- Lag time: what's beginning to emerge?
- Missed learning
- Open access policy

15 How can we meet our information needs going forward?
- Re-examining how we develop our outcomes
- Developing more meaningful, measurable interim milestones
- Incorporating expert scientific peer review panels

16 What can we do to improve the practices?
- Working with other funders
- Evaluating our own practices and sharing lessons
- Convening science evaluators

17 Questions?

