
Linking policy initiatives to available data: Assessment of scholarly activity in SSH and Law in a new perspective
Thed van Leeuwen, Centre for Science and Technology Studies (CWTS), Leiden University
Post-Conference Seminar, Vilnius, Lithuania, September 25th, 2013

Outline
– Policy context
– Proposed solutions
– Case study in a Dutch university
– Linking it together!
– Conclusions, discussion, and future steps

Policy context

Overview of the organization of Dutch research evaluation
– Standard Evaluation Protocol (SEP – 2003, 2009): Association of Dutch Universities (VSNU), National Research Council (NWO), Royal Dutch Academy of Sciences (KNAW)
– Judging Research on its Merits (2005)
– Report "Quality indicators for research in the Humanities" (Committee on quality indicators for the humanities, November 2011)
– Report "Towards a framework for the quality assessment of social science research" (Committee on quality indicators for the social sciences, March 2013)
Key issues that were addressed in both reports:
– How to deal with heterogeneity? [without 'standardizing' it away]
– Publication cultures
– Societal relevance

Proposed solutions

Quality indicators for research in the Humanities

Quality aspect: Scholarly output
Assessment criteria and indicators:
– Scholarly publications: articles, monographs, chapters in books, dissertations, other output
– Scholarly use of output: reviews, citations, other evidence of use
– Evidence of scholarly recognition: scholarly prizes, personal grants, other evidence of recognition

Quality aspect: Societal quality
Assessment criteria and indicators:
– Societal publications: articles in specialist publications, monographs for a wider public, chapters in books for a wider public, other societal output
– Societal use of output: projects in collaboration with civil-society actors, contract research, demonstrable civil-society effects, other evidence of use
– Evidence of societal recognition: societal prizes, other evidence of societal recognition

A case study in a Dutch university

Bibliometric analysis of output in a Dutch university: a case study on research output 2004–2009
– Scientific disciplines covered: medicine, social sciences, law, philosophy, history, and economics & business.
– Publication data: internal output registration system (METIS), covering 2004–2009. Various types of scientific output were included.
– Purpose of the study: to analyze the 'impact' of the university.

Difference between the internal registration system and the representation in WoS
– The dominance of the university hospital in the WoS realm is extremely visible.
– Law and the Humanities 'disappear' in the WoS realm.

Composition of the output of the university in METIS: the external coverage of a university
– The category 'General' is in some cases voluminous.
– All units do have journal publications!

Does it have impact?
– Taking all publications into consideration does not make any sense!
– For two units, international visibility increases!

Linking it together!

Linking it together: scholarly quality (criteria → indicators → METIS categories)
– Scholarly publications: articles, monographs, chapters in books, dissertations, other output → METIS categories: journals, books, chapters, theses, other
– Scholarly use of output (influencing other scholars): reviews, citations, other evidence of use → METIS categories: (WoS) book reviews, WoS/Scopus/GS citations
– Evidence of scholarly recognition: scholarly prizes, personal grants, other evidence of recognition → METIS categories: review committees, editorial boards, etc.

Linking it together: societal quality (criteria → indicators → METIS categories)
– Societal publications: articles in specialist publications, monographs for a wider public, chapters in books for a wider public, other societal output → METIS categories: non-scholarly journals, monographs for a wider public, chapters in books for a wider public, reports, other
– Societal use of output: projects in collaboration with civil-society actors, contract research, demonstrable civil-society effects, other evidence of use → METIS categories: media appearances, participation in advisory councils or the public debate
– Evidence of societal recognition: societal prizes, other evidence of societal recognition → METIS categories: media appearances, participation in advisory councils or the public debate, other
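The linkage sketched in the two slides above can be read as a simple lookup from METIS output categories to the cells of the assessment scheme. The sketch below shows one way such a mapping could be coded; the category labels, field names, and the METIS_TO_SCHEME table itself are illustrative assumptions, not the actual METIS classification or export format.

```python
# Illustrative sketch: mapping (assumed) METIS output categories onto the
# assessment scheme of criteria and indicators discussed above.

METIS_TO_SCHEME = {
    # assumed METIS category            -> (quality aspect,    criterion,                 indicator)
    "journal article":                    ("scholarly output", "scholarly publications",  "articles"),
    "book":                               ("scholarly output", "scholarly publications",  "monographs"),
    "book chapter":                       ("scholarly output", "scholarly publications",  "chapters in books"),
    "doctoral thesis":                    ("scholarly output", "scholarly publications",  "dissertations"),
    "book review":                        ("scholarly output", "scholarly use of output", "reviews"),
    "article in specialist publication":  ("societal quality", "societal publications",   "articles in specialist publications"),
    "contract research report":           ("societal quality", "societal use of output",  "contract research"),
}

def classify(records):
    """Count publication records per (quality aspect, criterion, indicator) cell."""
    counts = {}
    for rec in records:
        cell = METIS_TO_SCHEME.get(rec["category"], ("unclassified", "unclassified", rec["category"]))
        counts[cell] = counts.get(cell, 0) + 1
    return counts

# Hypothetical usage with a few dummy records:
records = [
    {"category": "journal article"},
    {"category": "book chapter"},
    {"category": "contract research report"},
]
for cell, n in classify(records).items():
    print(cell, n)
```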

Conclusions, discussion, and future steps

Some conclusions of the study
1. The METIS data clearly showed the possibility of linking the scientific outlets (registered in METIS) to the proposed assessment schemes …
2. … which also makes it possible to focus on societal quality!
3. Working on an environment that assists research assessments in the SSH should be done in close collaboration with the scholarly community involved.
4. Citation analysis of non-WoS source material seemed a fruitful approach.

Some recommendations …
1. The next challenge is adding the possible audiences of the various outlets now linked to the indicators.
2. In addition to this search for audiences, the request for 'valuing' the various indicators will inevitably pop up!
3. A challenge in the design of indicators based on such a system is to avoid treating it as a numbers game.
4. A national, discipline-wide initiative to register research output and societal impact seems called for …

Future steps …
1. We have inquired into the possibility of conducting a follow-up study within Leiden University, to further improve the methodology and to discuss the outcomes with researchers and research managers.
2. We have planned a workshop to discuss the possibilities of arriving at a national system of data collection that could support assessment procedures such as those shown in this presentation.

Dessert!

Development of authorship across all domains of scholarly activity

Definitions of JIF and Hirsch index
– Definition of JIF: the mean citation score of a journal, determined by dividing the citations received in year T to documents published in years T-1 and T-2 by the number of citable documents published in those two years.
– Definition of h-index: the 'impact' of a researcher, determined from the received citations of an oeuvre sorted in descending order, as the highest rank position at which the number of citations still equals or exceeds the rank position.
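Written out as formulas, with T the JIF census year, C_T(y) the citations received in year T by the documents a journal published in year y, N_y the number of citable documents it published in year y, and c_1 ≥ c_2 ≥ … a researcher's citation counts sorted in descending order, the two definitions read roughly as follows:

```latex
% JIF of a journal for census year T: citations received in T to items published
% in T-1 and T-2, divided by the number of citable items published in those years.
\mathrm{JIF}_T \;=\; \frac{C_T(T-1) + C_T(T-2)}{N_{T-1} + N_{T-2}}

% h-index: with citation counts c_1 \ge c_2 \ge \dots sorted in descending order,
% h is the largest rank whose publication is cited at least that many times.
h \;=\; \max\{\, i : c_i \ge i \,\}
```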

Problems with JIF
Some methodological problems of the JIF:
– Was/is calculated erroneously.
– Not field normalized.
– Not document-type normalized.
– Underlying citation distributions are highly skewed.
Some conceptual problems of the JIF:
– Inflates the impact of all researchers publishing in the journal.
– Promotes journal publishing, as the JIF is easily measured.
– Stimulates one-indicator thinking.
– Is based on expected values only and does not relate to the citations actually received.
– Ignores other scholarly virtues.
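The skewness point can be made concrete with a toy example: when one paper attracts most of a journal's citations, a mean-based score such as the JIF describes almost none of its papers. The citation counts below are invented purely for illustration.

```python
import statistics

# Invented citation counts for 10 papers in a hypothetical journal: one highly
# cited paper dominates, while most papers are cited rarely or not at all.
citations = [120, 6, 4, 3, 2, 1, 1, 0, 0, 0]

mean = statistics.mean(citations)      # a JIF-style average
median = statistics.median(citations)  # the citation count of a typical paper

print(f"mean = {mean:.1f}, median = {median}")
# -> mean = 13.7, median = 1.5: the average is driven by a single outlier and
#    says little about the citation impact of a typical paper in the journal.
```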

Problems with H-index
Some bibliometric-mathematical problems of the H-index:
– Is mathematically inconsistent in its behavior.
– Tends to rise only, no decrease possible, and is thus conservative by nature.
– Not field normalized.
Some bibliometric-methodological problems of the H-index:
– How to define an author?
– In which bibliographic/metric environment?
Some conceptual problems of the H-index:
– Is biased against youth, and favors age and experience.
– Is biased against selective researchers, and favors highly productive scientists.
– No relationship between H-index and research quality.
– Ignores other elements of scholarly activity.
– Promotes one-indicator thinking.

Thank you for your attention! Any questions? Ask me, or mail me.

Appendix on H-index

The H-Index and its limitations

The H-Index, defined as …
– The H-Index is the score indicating the position in a publication set, ranked by received citations in descending order, at which the number of received citations equals the ranking position of that publication.
– The idea of an American physicist, J. Hirsch, who published about this index in the Proceedings of the National Academy of Sciences of the USA.
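A minimal sketch of the computation this definition implies, in plain Python with no particular data source assumed:

```python
def h_index(citations):
    """Return the h-index of a list of per-publication citation counts."""
    ranked = sorted(citations, reverse=True)  # citation counts in descending order
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:   # this publication is cited at least `rank` times
            h = rank
        else:
            break
    return h

# Hypothetical oeuvre of five papers with these citation counts:
print(h_index([25, 8, 5, 3, 1]))  # -> 3
```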

Examples of Hirsch-index values
– Environmental biologist: output of 188 papers, cited 4,788 times in the period analyzed; Hirsch-index value of 31.
– Clinical psychologist: output of 72 papers, cited 760 times in the period analyzed; Hirsch-index value of 14.

Actual versus field-normalized impact (CPP/FCSm) displayed against the output. A large output can be combined with a relatively low impact.

H-Index displayed against the output. Larger output is strongly correlated with a high H-Index value.

Consistency: definition
Definition. A scientific performance measure is said to be consistent if and only if, for any two actors A and B and for any number n ≥ 0, the ranking of A and B given by the performance measure does not change when A and B both have a new publication with n citations.
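In symbols, with f the performance measure, S_A and S_B the publication sets of the two actors, and p_n a new publication with n citations (notation introduced here for convenience), consistency requires:

```latex
% Adding the same new publication p_n (with n >= 0 citations) to both actors
% must not change their relative ranking under the performance measure f.
\forall S_A, S_B,\; \forall n \ge 0:\qquad
f(S_A) \ge f(S_B) \;\Longrightarrow\; f\bigl(S_A \cup \{p_n\}\bigr) \ge f\bigl(S_B \cup \{p_n\}\bigr)
```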

Consistency: motivation
– Consistency ensures that if the publishing behavior of two actors does not change over time, their ranking relative to each other also does not change.
– Consistency ensures that if the individual researchers in one research group X outperform the individual researchers in another research group Y, then research group X as a whole outperforms research group Y.

Inconsistency of the h-index [example figure comparing Actor A and Actor B; h-index values shown: h = 4, h = 6, h = 8]
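A small numerical illustration of the same phenomenon (the citation counts are invented and are not the numbers from the original figure): before the new publication actor B ranks strictly above actor A; afterwards they are tied, so the ranking has changed and the h-index violates the consistency requirement defined above.

```python
def h_index(citations):
    """h-index of a list of per-publication citation counts."""
    ranked = sorted(citations, reverse=True)
    return max([rank for rank, c in enumerate(ranked, start=1) if c >= rank], default=0)

actor_a = [10, 10]     # two highly cited papers: h = 2
actor_b = [3, 3, 3]    # three modestly cited papers: h = 3, so B ranks above A

# Both actors add one new publication with n = 10 citations.
actor_a_new = actor_a + [10]   # h = 3
actor_b_new = actor_b + [10]   # h = 3: the strict ranking B > A has collapsed to a tie

print(h_index(actor_a), h_index(actor_b))          # 2 3
print(h_index(actor_a_new), h_index(actor_b_new))  # 3 3
```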

Problems with the H-Index
For serious evaluation of scientific performance, the H-Index is not suitable as an indicator, as the index:
– Is insensitive to field-specific characteristics (e.g., differences in citation cultures between medicine and other disciplines).
– Does not take into account the age and career length of scientists; a small oeuvre necessarily leads to a low H-Index value.
– Is inconsistent in its 'behaviour'.

Appendix on JFIS

Other journal impact measures …
JFIS (CWTS), the Journal-to-Field Impact Score: a field- and document-type normalized journal impact score, based on more publication data and longer citation windows than the JIF.
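Schematically, and only as a rough sketch rather than the exact CWTS implementation, a journal-to-field impact score compares a journal's average citation rate with the average citation rate of its field(s), with citations counted over matched document types and citation windows:

```latex
% Rough schematic of a journal-to-field impact score: the journal's mean citations
% per publication divided by the mean citations per publication in its field f(j),
% with matched document types and citation windows.
\mathrm{JFIS}(j) \;=\;
\frac{\frac{1}{N_j}\sum_{p \in j} c_p}{\frac{1}{N_{f(j)}}\sum_{q \in f(j)} c_q}
```

On this reading, a value above 1 means the journal is cited above its field average, which is consistent with the high values for the journals listed on the next slide.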

Journals with their JFIS values
JOURNAL                     JFIS   Ranking - Field
CELL                        7.06   (1 - Biochem & Mol Biol)
REV MOD PHYSICS             5.15   (2 - Physics)
ANN REV CELL DEV BIOL       5.04   (3 - Biochem & Mol Biol)
CHEMICAL REVIEWS            4.90   (4 - Chemistry)
NATURE MEDICINE             4.73   (5 - Medicine)
ANN REV OF BIOCHEM          4.64   (6 - Biochem & Mol Biol)
ANNALS OF MATHEMATICS       4.46   (7 - Mathematics)
NATURE BIOTECHNOLOGY        4.07   (8 - Biotech & Appl Microb)
ACTA MATHEMATICA            4.01   (9 - Mathematics)
BULL AM MATH SOC            4.00   (10 - Mathematics)
ANN REV CELL BIOL           3.78   (11 - Biochem & Mol Biol)
J AM MATH SOC               3.71   (12 - Mathematics)
J ROYAL STAT SOC B          3.49   (13 - Statistics & Prob)
PROG CHEM ORG NAT PROD      3.35   (14 - Organic Chem)
ACTA METALL MATER           3.19   (15 - Metall & Met Eng)
ANGEW CHEM-INT EDIT         3.15   (16 - Chemistry)
PHYS REV LETT               3.13   (17 - Physics)
J MICROELECTROMECH SYST     3.04   (18 - Elec & Electr Eng)
J RHEOLOGY                  3.02   (19 - Mechanics)
INVENT MATH                 3.01   (20 - Mathematics)