
1 Options for Evaluating Research: Inputs, Outputs, Outcomes
Michele Garfinkel, Manager, Science Policy Programme
ICSTI ITOC Workshop, 19 January 2015, Berlin

2 Today's talk
About EMBO
A policy view of research assessment
Stakeholder roles

3 About EMBO
European Molecular Biology Organization (Maria Leptin, Director)
Founded 1964, Heidelberg, DE
Funded by the European Molecular Biology Conference
–27 Member States
–3 cooperation agreements
Advancing policies for a world-class European research environment

4 Science Policy Programme
Three main areas: biotechnology, responsible conduct of research, scientific publishing
Governance
–Technology assessment
Scientific publishing
–Open access
–Data
–Responsibilities of editors, administrators, authors

5 [image-only slide; no text captured in the transcript]

6 Scientific publishing
The publication of scientific information is intended to move science forward. More specifically, the act of publishing is a quid pro quo in which authors receive credit and acknowledgment in exchange for disclosure of their scientific findings.

7 Journal name as proxy for quality
Journal Impact Factor: a librarian's number
The concern is not use, but misuse
–Research assessment
–"JIF 38.597: a subscription for the price of the IF"
Why has this been adopted for research assessment?
–Cross-disciplinary
–Intuitive and reflective
–Prospective
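A note on the arithmetic behind the number: the two-year JIF for year Y is the total citations received in Y by items the journal published in Y-1 and Y-2, divided by the number of citable items from those two years. A minimal Python sketch, using invented counts rather than any real journal's figures:

def journal_impact_factor(citations, citable_items):
    # Two-year JIF: citations received this year to items published
    # in the previous two years, divided by the citable items from
    # those same two years. A journal-level mean, not an article score.
    return citations / citable_items

# Invented example: 7,719 citations to 200 citable items gives a JIF
# of about 38.6 -- a mean over a highly skewed citation distribution,
# which is why it says little about any individual article.
print(round(journal_impact_factor(7719, 200), 3))  # 38.595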

8 Research assessment is an ecosystem
Funders
Researchers
Journals
Other assessors?

9 What DORA (the San Francisco Declaration on Research Assessment) sets out
Main recommendation: Do not use journal-based metrics, such as Journal Impact Factors, as a surrogate measure of the quality of individual research articles, to assess an individual scientist's contributions, or in hiring, promotion, or funding decisions
Implementation?

10 What DORA sets out
Research institutions and funding agencies: be clear on evaluation criteria and consider all contributions
Publishers: do not use the JIF as a marketing tool, make more article-level metrics available, make all reference lists open, remove limits on reference list length

11 What DORA sets out
Metrics suppliers: provide methodology and data in a useful form; account for variation in article types (reviews v. research articles)
Researchers: as assessors, review for scientific content; as authors, cite appropriate (primary) literature; challenge bad practices
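To make the article-type point concrete: citation distributions are skewed, and pooling reviews with research articles into one mean hides exactly the variation DORA asks suppliers to expose. A small sketch with synthetic citation counts (all numbers are assumptions for illustration):

from statistics import mean, median

# Synthetic citation counts, invented for illustration only.
citations = {
    "review":   [40, 55, 62, 71, 90],
    "research": [0, 1, 2, 3, 5, 8, 12, 120],  # one outlier "hit" paper
}

for article_type, counts in citations.items():
    print(f"{article_type:9s} mean={mean(counts):6.1f} median={median(counts):6.1f}")

# The research-article mean (18.9) is dragged up by a single outlier
# while the median is 4.0; a pooled journal-wide mean hides this.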

12 What DORA does not say
Metrics-based research assessment is wrong
The JIF is flawed for assessing journals
Citations are a flawed metric
There is a simple alternative
Publishers are to blame
Thomson Reuters is to blame

13 What DORA does not say
Metrics-based research assessment is wrong
The JIF is flawed for assessing journals
Citations are a flawed metric
There is a simple alternative
Publishers are to blame
X is to blame
Altmetric Score

14–16 [image-only slides; no text captured in the transcript]

17 Incremental advances
More institutions and funders emphasizing biosketches and 'select your 5 best papers' strategies over the IF
Constructive discussions with Thomson Reuters: more interest in dialogue and a willingness to improve the JIF as a metric
Competition is good for everyone

18 Incremental advances
Engagement with funders
Engaging additional research communities
Study national/regional variations
Editorials forthcoming
–Key point: better analyses needed
Policy analysis
–Implementation and governance issues, metrics, stakeholders

19 It's the system (?)
This is not (just) about overworked or lazy promotion committees and rapacious journals
The reward system in science is (becoming) warped
Resources for thorough evaluation are not available
Journal articles have become the currency of rewards rather than a contribution to knowledge

20 Research Assessment: Stakeholders
Researchers
Publishers
Research administrators
Funders
Metrics researchers
Metrics providers
Decision-makers

21 What should we be assessing?
We are great at measuring inputs (funding, numbers of students)
We are good at measuring outputs (numbers of papers, some impact measures)
Measuring outcomes is a problem
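The inputs/outputs/outcomes asymmetry can be read as a data-modelling problem: inputs and outputs map naturally onto countable fields, while outcomes resist being reduced to a number. A toy sketch; every field name and value here is an assumption invented for illustration:

from dataclasses import dataclass

@dataclass
class AssessmentRecord:
    # Inputs: routinely and precisely captured by administrations
    funding_eur: float
    students: int
    # Outputs: countable, if imperfectly
    papers: int
    datasets: int
    # Outcomes: long-term effects on knowledge and practice; there is
    # no obvious numeric type, which is exactly the measurement problem
    outcomes: str = "not captured by standard metrics"

print(AssessmentRecord(funding_eur=250_000, students=3, papers=4, datasets=2))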

22 What should we be assessing?
Papers
–And how they are discovered?
Data
–And how they are discovered?
Reviewing? Teaching? Committee work? Responsible conduct?

23 Ongoing work
Workshops
–Governance issues
–Stakeholders
Engagement with funders

