Accounting for Impact: Evaluation Practices and the Effects of Indicator Use. Dr. Sarah de Rijcke, Centre for Science and Technology Studies (CWTS). Copenhagen, 10 March 2016

Presentation transcript:

1 Accounting for Impact: Evaluation Practices and the Effects of Indicator Use. Dr. Sarah de Rijcke, Centre for Science and Technology Studies (CWTS). Copenhagen, 10 March 2016

2 Formative effects of evaluation: 1. Research projects 2. Teaching 3. Interventions & debate

3 KNOWSCIENCE project; Open Data project

4

5 Increasing importance of metrics: strong tensions with central goals of European and national research policies to foster excellent, collaborative, socially responsible, and societally relevant science

6

7 Four main problems: 1. The funding system 2. The career structure 3. The publication system 4. The evaluation system. Slide credit: Paul Wouters (CWTS)

8 The 'Evaluation Gap': a discrepancy between evaluation criteria and the social, cultural, and economic functions of science. Wouters, P.F. A key challenge: the evaluation gap. Blog post, 28 August 2014. citationculture.wordpress.com

9 Literature review on the effects of indicators (De Rijcke et al. 2015, Research Evaluation). Three possible consequences: 1. Goal displacement 2. Task reduction 3. Changes in the relations between government and institutions. Performance-based funding (PBF) in national systems does trickle down to the institutional and individual level.

10

11 Thinking with Indicators in the Life Sciences. Sarah de Rijcke, Ruth Müller (TU Munich), Alex Rushforth (CWTS), Paul Wouters (CWTS)

12 One indicator: the Journal Impact Factor. Three steps in knowledge production: A. Planning research B. Collaboration and authorship practices C. Assessing work-in-progress manuscripts. Rushforth & De Rijcke (2015)

13 Theme: Planning research

14 Planning Research: selecting research questions. Interviewer: What would you say is an 'ideal' postdoc project? PHD_2M: 'One that gives me a good paper in a year! [laughter] Well, you can never entirely cancel out all risk. But what I mean by "risk" is how predictable it is what will come out of the research in a certain frame of time.'

15 Planning Research: structuring research on the experimental level. 'You already need to plan in the very beginning what papers you will be able to publish, which experiments do I need for that. It sounds way more calculating than you think when you naively start your research career. But you just have to focus on what's good for the papers.' (PDoc_6f, 1319)

16 Theme: Collaboration and authorship practices. Image: Andy Lamb, co-authorship network map of physicians publishing on hepatitis C (detail). https://www.flickr.com/photos/speedoflife/8274993170/

17 Collaboration and authorship practices: determining the safest bet.
Respondent: I just had a discussion with [PhD student X] on a project that's never going to be high impact. But then we have the choice; either publish it in a lower journal, or forget about it. And then, of course, we're also practical and say, "Okay, we have to publish it."
Interviewer: Okay, yes. So you can decide whether to do more experiments on the basis of whether you think it stands a chance in a higher impact journal.
Respondent: Of course, but then if we stick to [same PhD] as an example, she also has projects that are running really well. And so then, my problem, or something that I have to decide, is are we actually going to invest in that project that we don't think is very high impact, or are we going to try to publish it as it is, in a lower journal, so that she has all the time to work on the projects that are going well, and that do have an interesting set of results? (PI interview, Surgical Oncology, Institute B)

18 Theme: Assessing work-in-progress manuscripts

19 Assessing work-in-progress manuscripts: grading for novelty and quality.
PI goes to computer. "Any alternatives? Any journals?"
PhD: Hmm, maybe [Journal C]. They are similar in impact, right?
Post-doc: Yeah, seven-ish. It's difficult because some papers are descriptive and some have mechanism. So for this paper it could actually go one step higher than Journal C, because you're going a bit beyond description. They also have priority reports in [Journal B].
PI: [Journal D] also has very fast publishing periods from date of submission, if they like it of course. (Fieldnote, 22 July 2014)

20 Conclusions: Respondents' 'folk theories' (Rip 2006) of indicators have important epistemic implications, affecting the types of work researchers consider viable and interesting. Indicator considerations eclipsed other judgments about work-in-progress and became the dominant way of attributing worth. What kinds of 'excellent' science does this result in? Researchers are not incentivized to think about 'responsible research' or relevance to society.

21 We need new assessment models to bridge the evaluation gap

22 A collaboration between Diana Hicks (Georgia Tech), Paul Wouters (CWTS), Ismael Rafols (SPRU/Ingenio), Sarah de Rijcke, and Ludo Waltman (CWTS)

23

24

25 https://vimeo.com/133683418

26 www.leidenmanifesto.org

27
