
1 Linking policy initiatives to available data. Assessment of scholarly activity in SSH and Law in a new perspective. Thed van Leeuwen, Centre for Science and Technology Studies (CWTS), Leiden University. Post-Conference Seminar, Vilnius, Lithuania, September 25th, 2013

2 Outline: Policy context – Proposed solutions – Case study in a Dutch university – Linking it together! – Conclusions, discussion, and future steps

3 Policy context

4 Overview of the organization of Dutch research evaluation
– Standard Evaluation Protocol (SEP – 2003, 2009): Association of Dutch Universities (VSNU), National Research Council (NWO), Royal Dutch Academy of Sciences (KNAW)
– Judging Research on its Merits (2005)
– Report “Quality indicators for research in the Humanities” (Committee on quality indicators for the humanities, November 2011)
– Report “Towards a framework for the quality assessment of social science research” (Committee on quality indicators for the social sciences, March 2013)
– Key issues addressed in both reports: how to deal with heterogeneity [without ‘standardizing’ it away], publication cultures, and societal relevance

5 Proposed solutions

6 Quality indicators for research in the Humanities

7 Quality aspect: Scholarly output
– Assessment criterion: Scholarly publications. Indicators: Articles, Monographs, Chapters in books, Dissertations, Other output
– Assessment criterion: Scholarly use of output. Indicators: Reviews, Citations, Other evidence of use
– Assessment criterion: Evidence of scholarly recognition. Indicators: Scholarly prizes, Personal grants, Other evidence of recognition

8 Quality aspect: Societal quality
– Assessment criterion: Societal publications. Indicators: Articles in specialist publications, Monographs for a wider public, Chapters in books for a wider public, Other societal output
– Assessment criterion: Societal use of output. Indicators: Projects in collaboration with civil-society actors, Contract research, Demonstrable civil-society effects, Other evidence of use
– Assessment criterion: Evidence of societal recognition. Indicators: Societal prizes, Other evidence of societal recognition

9 A case study in a Dutch University

10 Bibliometric analysis of output in a Dutch university: a case study on research output 2004-2009. Scientific disciplines covered: medicine, social sciences, law, philosophy, history, and economics & business. Publication data: internal output registration system (METIS), covering 2004-2009. Various types of scientific output were included. Purpose of the study: to analyze the ‘impact’ of the university.

11 Difference between the internal registration system and the representation in WoS. The dominance of the university hospital in the WoS realm is extremely visible; Law and the Humanities ‘disappear’ in the WoS realm.

12 Composition of the output of the university in METIS: the external coverage of a university. The category ‘General’ is in some cases voluminous. All units do have journal publications!

13 Does it have impact? Taking all publications into consideration does not make any sense! For two units, international visibility increases!

14 Linking it together!

15 Scholarly output: linking criteria and indicators to METIS categories
– Criterion: Scholarly publications. Indicators: Articles, Monographs, Chapters in books, Dissertations, Other output. METIS categories: Journals, Books, Chapters, Theses (WoS), Other
– Criterion: Scholarly use of output. Indicators: Reviews, Citations, Other evidence of use. METIS categories: Book Reviews, WoS/Scopus/GS Citations, Influencing other scholars
– Criterion: Evidence of scholarly recognition. Indicators: Scholarly prizes, Personal grants, Other evidence of recognition. METIS categories: Review committees, editorial boards, etc.

16 Societal quality: linking criteria and indicators to METIS categories
– Criterion: Societal publications. Indicators: Articles in specialist publications, Monographs for a wider public, Chapters in books for a wider public, Other societal output. METIS categories: Non-scholarly journals, Monographs for a wider public, Chapters in books for a wider public, Reports, Other
– Criterion: Societal use of output. Indicators: Projects in collaboration with civil-society actors, Contract research, Demonstrable civil-society effects, Other evidence of use. METIS categories: Media appearances, Participation in advisory councils or the public debate, Other
– Criterion: Evidence of societal recognition. Indicators: Societal prizes, Other evidence of societal recognition. METIS categories: Media appearances, Participation in advisory councils or the public debate
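To make the linking shown on these two slides concrete, here is a minimal Python sketch of how registered output records could be grouped under the assessment criteria. The output category names used here are illustrative assumptions, not the actual METIS classification.

# Illustrative mapping from output categories to (quality aspect, assessment criterion).
# The category names are assumptions; the real METIS classification differs.
CATEGORY_TO_CRITERION = {
    "journal article":             ("Scholarly output", "Scholarly publications"),
    "book":                        ("Scholarly output", "Scholarly publications"),
    "book chapter":                ("Scholarly output", "Scholarly publications"),
    "doctoral thesis":             ("Scholarly output", "Scholarly publications"),
    "book review":                 ("Scholarly output", "Scholarly use of output"),
    "article in specialist press": ("Societal quality", "Societal publications"),
    "report":                      ("Societal quality", "Societal publications"),
    "media appearance":            ("Societal quality", "Evidence of societal recognition"),
}

def group_output(records):
    """Group output records (dicts with a 'category' key) per quality aspect and criterion."""
    grouped = {}
    for record in records:
        key = CATEGORY_TO_CRITERION.get(record["category"], ("Unclassified", "Other"))
        grouped.setdefault(key, []).append(record)
    return grouped

# Invented example records.
records = [
    {"title": "An article", "category": "journal article"},
    {"title": "A newspaper interview", "category": "media appearance"},
]
for (aspect, criterion), items in group_output(records).items():
    print(f"{aspect} / {criterion}: {len(items)} item(s)")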

17 Conclusions, discussion, and future steps

18 Some conclusions of the study
1. The METIS data clearly showed that the scientific outlets registered in METIS can be linked to the proposed assessment schemes …
2. … which also makes it possible to focus on societal quality!
3. Developing an environment that supports research assessments in the SSH should be done in close collaboration with the scholarly community involved.
4. Citation analysis of non-WoS source material seemed a fruitful approach.

19 Some recommendations …
1. The next challenge is to add the possible audiences of the various outlets now linked to the indicators.
2. In addition to this search for audiences, the question of how to ‘value’ the various indicators will inevitably come up!
3. A challenge in the design of indicators based on such a system is to avoid treating it as a numbers game.
4. A national, discipline-wide initiative to register research output and societal impact seems called for …

20 Future steps …
1. We have explored the possibility of conducting a follow-up study within Leiden University, to further improve the methodology and discuss the outcomes with researchers and research managers.
2. We have planned a workshop to discuss the possibilities of arriving at a national system of data collection that could support assessment procedures as shown in this presentation.

21 Dessert!

22 Development of authorship across all domains of scholarly activity

23 Definitions of JIF and Hirsch index
Definition of JIF:
– The mean citation score of a journal, determined by dividing the citations received in year T by items published in years T-1 and T-2 by the number of citable documents published in years T-1 and T-2.
Definition of h-index:
– The ‘impact’ of a researcher, determined from the citation counts of an oeuvre sorted in descending order, as the highest rank position at which the number of citations is at least equal to the rank position.
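As a worked illustration of the JIF definition above, a minimal Python sketch with invented numbers; it follows the simplified two-year definition given on the slide, not the exact counting rules used by the database producer.

def journal_impact_factor(citations_to_prev_two_years, citable_items_prev_two_years):
    """Two-year JIF for year T: citations received in year T to items the journal
    published in years T-1 and T-2, divided by the number of citable items
    published in T-1 and T-2 (simplified version of the slide's definition)."""
    return citations_to_prev_two_years / citable_items_prev_two_years

# Invented example: 600 citations in 2013 to the 300 citable items from 2011-2012.
print(journal_impact_factor(600, 300))   # -> 2.0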

24 Problems with JIF
Some methodological problems of the JIF:
– Was/is calculated erroneously.
– Not field normalized.
– Not document type normalized.
– Underlying citation distributions are highly skewed.
Some conceptual problems of the JIF:
– Inflates the impact of all researchers publishing in the journal.
– Promotes journal publishing, as the JIF is easily measured.
– Stimulates one-indicator thinking.
– Is based on expected values only; it does not reflect the citations papers actually receive.
– Ignores other scholarly virtues.

25 Problems with the h-index
Some bibliometric-mathematical problems of the h-index:
– Is mathematically inconsistent in its behavior.
– Tends to rise only; no decrease is possible, and it is thus conservative by nature.
– Not field normalized.
Some bibliometric-methodological problems of the h-index:
– How to define an author?
– In which bibliographic/metric environment?
Some conceptual problems of the h-index:
– Is biased against youth, and favors age and experience.
– Is biased against selective researchers, and favors highly productive scientists.
– No relationship between the h-index and research quality.
– Ignores other elements of scholarly activity.
– Promotes one-indicator thinking.

26 Thank you for your attention! Any questions? Ask me, or mail me leeuwen@cwts.nl

27 Appendix on H-index

28 The H-Index and its limitations

29 The H-Index, defined as …
The H-Index is the score indicating the highest rank position in a publication set, sorted by descending citation counts, at which the number of citations received by a publication is at least equal to its rank position. The idea of the American physicist J. Hirsch, who published this index in the Proceedings of the National Academy of Sciences of the USA.
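To make this definition concrete, a minimal Python sketch of the computation; the citation counts in the example are invented.

def h_index(citation_counts):
    """h-index: the largest h such that, with the publications sorted by
    descending citation counts, the publication at rank h has at least h citations."""
    ranked = sorted(citation_counts, reverse=True)
    h = 0
    for rank, citations in enumerate(ranked, start=1):
        if citations >= rank:
            h = rank
        else:
            break
    return h

# Invented oeuvre of 8 publications with these citation counts -> h-index of 5.
print(h_index([25, 18, 12, 9, 7, 4, 3, 1]))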

30 Examples of Hirsch-index values
– Environmental biologist: output of 188 papers, cited 4,788 times in the period 1980-2004. Hirsch-index value of 31.
– Clinical psychologist: output of 72 papers, cited 760 times in the period 1980-2004. Hirsch-index value of 14.

31 Actual versus field-normalized impact (CPP/FCSm) displayed against the output. Large output can be combined with a relatively low impact.

32 H-Index displayed against the output. Larger output is strongly correlated with a high H-Index value.

33 Consistency: definition. A scientific performance measure is said to be consistent if and only if, for any two actors A and B and for any number n ≥ 0, the ranking of A and B given by the performance measure does not change when A and B both have a new publication with n citations.

34 Consistency: motivation
– Consistency ensures that if the publishing behaviour of two actors does not change over time, their ranking relative to each other also does not change.
– Consistency ensures that if the individual researchers in one research group X outperform the individual researchers in another research group Y, then research group X as a whole outperforms research group Y.

35 Inconsistency of the h-index [example figure comparing Actor A and Actor B, with h-index values of 4, 6, and 8]
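The slide's figure is not fully recoverable from the transcript, so the sketch below uses invented citation counts (not the actors from the figure) to demonstrate the same point: two actors each add one identically cited publication, after which their relative ranking under the h-index changes.

def h_index(citation_counts):
    """Largest h such that at least h publications have at least h citations."""
    ranked = sorted(citation_counts, reverse=True)
    return max((rank for rank, c in enumerate(ranked, start=1) if c >= rank), default=0)

# Hypothetical actors:
actor_a = [10, 10, 10, 10, 10]   # h = 5
actor_b = [6, 6, 6, 6, 6, 6]     # h = 6, so B outranks A
print(h_index(actor_a), h_index(actor_b))   # 5 6

# Both actors publish one new paper that receives 10 citations.
actor_a.append(10)
actor_b.append(10)
print(h_index(actor_a), h_index(actor_b))   # 6 6 -> B no longer outranks A,
                                            # violating the consistency requirement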

36 Problems with the H-Index
For the serious evaluation of scientific performance, the H-Index is not a suitable indicator, as the index:
– Is insensitive to field-specific characteristics (e.g., differences in citation cultures between medicine and other disciplines).
– Does not take into account the age and career length of scientists; a small oeuvre necessarily leads to a low H-Index value.
– Is inconsistent in its ‘behaviour’.

37 Appendix on JFIS

38 Other journal impact measures … JFIS (CWTS), the Journal-to-Field Impact Score – a field- and document-type normalized journal impact score, based on more publication data and longer citation windows.

39 Journals with their JFIS values
JOURNAL                    JFIS   Ranking - Field
CELL                       7.06   ( 1 - Biochem & Mol Biol)
REV MOD PHYSICS            5.15   ( 2 - Physics)
ANN REV CELL DEV BIOL      5.04   ( 3 - Biochem & Mol Biol)
CHEMICAL REVIEWS           4.90   ( 4 - Chemistry)
NATURE MEDICINE            4.73   ( 5 - Medicine)
ANN REV OF BIOCHEM         4.64   ( 6 - Biochem & Mol Biol)
ANNALS OF MATHEMATICS      4.46   ( 7 - Mathematics)
NATURE BIOTECHNOLOGY       4.07   ( 8 - Biotech & Appl Microb)
ACTA MATHEMATICA           4.01   ( 9 - Mathematics)
BULL AM MATH SOC           4.00   (10 - Mathematics)
ANN REV CELL BIOL          3.78   (11 - Biochem & Mol Biol)
J AM MATH SOC              3.71   (12 - Mathematics)
J ROYAL STAT SOC B         3.49   (13 - Statistics & Prob)
PROG CHEM ORG NAT PROD     3.35   (14 - Organic Chem)
ACTA METALL MATER          3.19   (15 - Metall & Met Eng)
ANGEW CHEM-INT EDIT        3.15   (16 - Chemistry)
PHYS REV LETT              3.13   (17 - Physics)
J MICROELECTROMECH SYST    3.04   (18 - Elec & Electr Eng)
J RHEOLOGY                 3.02   (19 - Mechanics)
INVENT MATH                3.01   (20 - Mathematics)
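The slides do not spell out the JFIS formula, so the sketch below only illustrates the general idea of a field- and document-type normalized journal score: the journal's actual citations divided by what papers of the same field and document type receive on average. It is a simplified, assumption-laden reading, not the actual CWTS implementation, and the data structure and example records are invented.

from collections import defaultdict

def normalized_journal_scores(papers):
    """Rough JFIS-style score per journal: total citations of the journal's papers
    divided by the sum of the field/document-type mean citations of those papers.
    `papers` is a list of dicts with keys 'journal', 'field', 'doc_type', 'citations'.
    Simplified illustration only."""
    totals = defaultdict(lambda: [0, 0])          # (field, doc_type) -> [citations, n papers]
    by_journal = defaultdict(list)
    for p in papers:
        key = (p["field"], p["doc_type"])
        totals[key][0] += p["citations"]
        totals[key][1] += 1
        by_journal[p["journal"]].append(p)

    scores = {}
    for journal, plist in by_journal.items():
        expected = sum(totals[(p["field"], p["doc_type"])][0] /
                       totals[(p["field"], p["doc_type"])][1] for p in plist)
        actual = sum(p["citations"] for p in plist)
        scores[journal] = actual / expected if expected else 0.0
    return scores

# Tiny invented example: one mathematics journal cited above its field average, one below.
papers = [
    {"journal": "J Math A", "field": "Mathematics", "doc_type": "article", "citations": 6},
    {"journal": "J Math A", "field": "Mathematics", "doc_type": "article", "citations": 4},
    {"journal": "J Math B", "field": "Mathematics", "doc_type": "article", "citations": 2},
    {"journal": "J Math B", "field": "Mathematics", "doc_type": "article", "citations": 0},
]
print(normalized_journal_scores(papers))   # J Math A scores above 1, J Math B below 1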

