Accounting for Impact: Evaluation Practices and the Effects of Indicator Use
Dr. Sarah de Rijcke, Centre for Science and Technology Studies (CWTS)
Copenhagen, 10 March 2016

Formative effects of evaluation
1. Research projects
2. Teaching
3. Interventions & debate

KNOWSCIENCE project
Open Data project


Increasing importance of metrics
Strong tensions with central goals of European and national research policies to foster excellent and collaborative, socially responsible, and societally relevant science.

Four main problems
1. The funding system
2. The career structure
3. The publication system
4. The evaluation system
Slide credit: Paul Wouters (CWTS)

The 'Evaluation Gap': the discrepancy between evaluation criteria and the social, cultural, and economic functions of science.
Wouters, P.F. "A key challenge: the evaluation gap." Blog post, August 28, citationculture.wordpress.com

Literature review on the effects of indicators (De Rijcke et al. 2015, Research Evaluation)
Three possible consequences:
1. Goal displacement
2. Task reduction
3. Changes in the relations between governments and institutions
Performance-based funding (PBF) in national systems does trickle down to the institutional and individual level.


Thinking with Indicators in the Life Sciences
Sarah de Rijcke, Ruth Müller (TU Munich), Alex Rushforth (CWTS), Paul Wouters (CWTS)

One indicator: the Journal Impact Factor
Three steps in knowledge production:
A. Planning research
B. Collaboration and authorship practices
C. Assessing work-in-progress manuscripts
Rushforth & De Rijcke (2015)
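
Aside (not on the original slides): for readers unfamiliar with the indicator the respondents refer to, the standard two-year Journal Impact Factor for a journal in year $Y$ is computed as

$$\mathrm{JIF}_Y = \frac{C_Y(Y-1) + C_Y(Y-2)}{N_{Y-1} + N_{Y-2}}$$

where $C_Y(y)$ is the number of citations received in year $Y$ by items the journal published in year $y$, and $N_y$ is the number of citable items the journal published in year $y$. The symbols $C$ and $N$ are illustrative shorthand, not notation from the talk.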

Theme: Planning research

Planning Research: Selecting research questions
I: What would you say is an 'ideal' postdoc project?
PHD_2M: One that gives me a good paper in a year! [laughter] Well, you can never entirely cancel out all risk. But what I mean with 'risk' is how predictable it is what will come out of the research in a certain frame of time.

Planning Research: Structuring research on the experimental level
'You already need to plan in the very beginning what papers you will be able to publish, which experiments do I need for that. It sounds way more calculating than you think when you naively start your research career. But you just have to focus on what's good for the papers.' (PDoc_6f, 1319)

Theme: Collaboration and authorship practices
[Image: Andy Lamb, co-authorship network map of physicians publishing on hepatitis C (detail)]

Collaboration and authorship practices: Determining the safest bet
Respondent: I just had a discussion with [PhD student X] on a project that's never going to be high impact. But then we have the choice: either publish it in a lower journal, or forget about it. And then, of course, we're also practical and say, "Okay, we have to publish it."
Interviewer: Okay, yes. So you can decide whether to do more experiments on the basis of whether you think it stands a chance in a higher impact journal.
Respondent: Of course, but then if we stick to [same PhD] as an example, she also has projects that are running really well. And so then, my problem, or something that I have to decide, is are we actually going to invest in that project that we don't think is very high impact, or are we going to try to publish it as it is, in a lower journal, so that she has all the time to work on the projects that are going well, and that do have an interesting set of results? (PI Interview, Surgical Oncology, Institute B)

Theme: Assessing work-in-progress manuscripts

Assessing work-in-progress manuscripts: Grading for novelty and quality
PI goes to computer. "Any alternatives? Any journals?"
PhD: Hmm, maybe [Journal C]. They are similar in impact, right?
Post-doc: Yeah, seven-ish. It's difficult because some papers are descriptive and some have mechanism. So for this paper it could actually go one step higher than Journal C, because you're going a bit beyond description. They also have priority reports in [Journal B].
PI: [Journal D] also has very fast publishing periods from date of submission, if they like it of course. (Fieldnote, 22 July 2014)

Conclusions
Respondents' 'folk theories' (Rip 2006) of indicators have important epistemic implications, affecting the types of work researchers consider viable and interesting.
Indicator considerations eclipsed other judgments about work-in-progress and became a dominant way of attributing worth.
What kinds of 'excellent' science does this result in? Researchers are not incentivized to think about 'responsible research' or relevance to society.

We need new assessment models to bridge the evaluation gap.

A collaboration between Diana Hicks (Georgia Tech), Paul Wouters (CWTS), Ismael Rafols (SPRU/Ingenio), Sarah de Rijcke, and Ludo Waltman (CWTS).
