Challenges in Evaluating Basic Science Investments: A Funder's Perspective
Julia Klebanov


Gordon and Betty Moore Foundation
Fosters path-breaking scientific discovery, environmental conservation, patient care improvements, and preservation of the character of the San Francisco Bay Area.
Founded November 2000. Endowment of $6.4 billion.

Gordon and Betty Moore Foundation
“We want the foundation to tackle large, important issues at a scale where it can achieve significant and measurable impacts.” –Gordon Moore

Science Program
Seeks to advance basic science through developing new technologies, supporting imaginative scientists, and creating new collaborations at the frontiers of traditional scientific disciplines.
We fund research designed to:
- Advance our understanding of the world by asking new questions
- Enable new science through advances in technology
- Break down barriers and cultivate collaborations
- Enhance society's understanding of the joy of science
(Image courtesy of Princeton University)

Science Program Areas
- Marine Microbiology Initiative
- Data-Driven Discovery Initiative
- Emergent Phenomena in Quantum Systems Initiative
- Thirty Meter Telescope
- Imaging
- Science Learning
- Astronomy Special Projects
(Image courtesy of Sossina Haile)

Measurement, Evaluation and Learning in Philanthropy
Measurement: an internal process of gathering information to monitor progress in the implementation of our work; occurs on an ongoing basis.
Evaluation: periodic assessments of ongoing or completed work; conducted either internally or by an external third party.
Learning: using data and insights from measurement and evaluation to inform strategy and decision-making.

Measurement, Evaluation and Learning at Moore
Responsible for ensuring access to the best evaluation, data, knowledge management, measurement systems and practices that support evidence-based decision-making.
Examples:
- Working with Science program staff to develop measurement frameworks
- Designing and managing external evaluations
- Facilitating internal reviews
- Field-building

Developing a Funding/Evaluation Strategy
- Unit of funding (e.g. individuals, institutions, projects)
- Risk: supporting risky research is often a niche for philanthropies
- Integrating both financial and non-financial supports

How do we monitor and evaluate our investments?
- Grantee requirements:
  - Annual reports (self-reported data)
  - Meetings/site visits
- Most of the quantitative data we track measures scientific output (e.g. publications, presentations, number of instruments developed); see the tally sketch below
- Internal research/strategy reviews
- External evaluations
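As a minimal illustrative sketch (not the foundation's actual tooling), self-reported outputs from annual reports could be tallied per grant and output type as below. The CSV file name and the columns grant_id, year, output_type, and count are assumptions made for the example.

    # Tally self-reported outputs from annual reports.
    # Assumes a hypothetical CSV with columns: grant_id, year, output_type, count.
    import csv
    from collections import defaultdict

    def tally_outputs(report_csv):
        """Sum reported outputs per (grant, output type) pair."""
        totals = defaultdict(int)
        with open(report_csv, newline="") as f:
            for row in csv.DictReader(f):
                totals[(row["grant_id"], row["output_type"])] += int(row["count"])
        return totals

    if __name__ == "__main__":
        for (grant, kind), n in sorted(tally_outputs("annual_reports.csv").items()):
            print(f"{grant}\t{kind}\t{n}")

Even a simple tally like this makes gaps in self-reported data visible, for example grants that report presentations but no publications.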

Conceptual Challenges for Monitoring & Evaluation
- Basic science does not follow a linear path
- Difficulty of setting up a measurement framework prospectively for outcomes that cannot be precisely defined
- Tension between setting aspirational outcomes and being realistic about what's achievable during the life of a portfolio

Conceptual Challenges for Monitoring & Evaluation
- Many of our initiatives have strategies aimed at developing new ways of thinking, or changing the culture of a research community; capturing the nuances of progress towards these types of outcomes is a challenge
- Timescale issues:
  - Large initiatives are typically approved for 5-7 years
  - Evaluations are conducted ~4-5 years into the life of the initiative
  - Ultimate impact is not expected until many years later

Measurement Challenges
- Most data is self-reported: how can we objectively measure the progress of grants?
- Limited baseline data: how do we collect this for the “state of the field”?
- Bibliometrics: grantees fail to acknowledge funding, and publication counts don't always capture quality (see the sketch below)
- Investigator counterfactual: would they have done it anyway? Contribution vs. attribution
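As one illustration of the bibliometrics problem, the hedged sketch below flags publications whose acknowledgment text never mentions the funder, so an acknowledgment-based publication count would miss them. The publication records and the grant-number pattern are assumptions made for the example, not the foundation's actual data format.

    # Flag publications that do not acknowledge the funder.
    # Publication records and the grant-number pattern are hypothetical.
    import re

    FUNDER_PATTERNS = [
        re.compile(r"moore foundation", re.IGNORECASE),
        re.compile(r"GBMF\s*\d+", re.IGNORECASE),  # assumed grant-number style
    ]

    def missing_acknowledgment(publications):
        """Return titles of publications whose acknowledgments match no funder pattern."""
        return [
            pub["title"]
            for pub in publications
            if not any(p.search(pub.get("acknowledgments", "") or "") for p in FUNDER_PATTERNS)
        ]

    pubs = [
        {"title": "Paper A", "acknowledgments": "Supported by GBMF 1234."},
        {"title": "Paper B", "acknowledgments": "We thank our colleagues."},
    ]
    print(missing_acknowledgment(pubs))  # -> ['Paper B']

A check like this only surfaces the attribution gap; it says nothing about quality, which is the other limit noted above.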

Measurement Challenges
Informal collection of qualitative data (e.g. getting out in the field, talking with grantees and members of the scientific community) raises different issues:
- It diminishes the rigor of systematic data collection
- Responses may be biased toward what grantees think you want to hear
- How do we aggregate the progress reported by grantees and roll it up to better understand progress towards overarching initiative outcomes? (A roll-up sketch follows below.)
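The sketch below shows one possible way to roll grant-level reports up to initiative-level outcomes: each grant reports a status against the outcome it supports, and the statuses are summarized per outcome. The outcome names and status scale are invented for illustration; a real roll-up would also need the qualitative nuance discussed above.

    # Roll up grantee-reported statuses to initiative-level outcomes.
    # Outcome names and the status scale are illustrative only.
    from collections import Counter, defaultdict

    reports = [
        {"grant": "G1", "outcome": "new measurement technology", "status": "on track"},
        {"grant": "G2", "outcome": "new measurement technology", "status": "delayed"},
        {"grant": "G3", "outcome": "research community culture change", "status": "on track"},
    ]

    def roll_up(reports):
        """Count grant-reported statuses per initiative-level outcome."""
        summary = defaultdict(Counter)
        for r in reports:
            summary[r["outcome"]][r["status"]] += 1
        return summary

    for outcome, statuses in roll_up(reports).items():
        print(outcome, dict(statuses))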

How does this all relate to open science?
- Need to be able to measure research outputs as early and often as possible
- Limits of bibliometric analyses
- Lag time: what's beginning to emerge?
- Missed learning
- Open access policy

How can we meet our information needs going forward?
- Re-examining how we develop our outcomes
- Developing more meaningful, measurable interim milestones
- Incorporating expert scientific peer review panels

What can we do to improve these practices?
- Working with other funders
- Evaluating our own practices and sharing lessons
- Convening science evaluators

Questions? Julia.Klebanov@moore.org