A New Approach for Automated Author Discipline Categorization and Evaluation of Cross-Disciplinary Collaborations for Grant Programs
Ilya Ponomarev 1, Pawel Sulima 1, Jodi Basner 1, Unni Jensen 1, Joshua Schnell 1, Karen Jo 2, and Nicole Moore 2
1 Custom Analytics, Rockville, MD; 2 National Cancer Institute, Bethesda, MD
10/16/2013 5:30 PM

Why Cross-disciplinary Research?
"Interdisciplinary research can be one of the most productive and inspiring of human pursuits." (Facilitating Interdisciplinary Research, National Academy of Sciences, 2005)
- Innovation increasingly occurs at the boundaries of disciplines
- Complex "puzzles" require diverse backgrounds
- The data avalanche from multiple sources requires fusion of information
- Convergent technologies require integration across disciplines

US Government Funding of Cross-disciplinary R&D
Agencies: DOD, DOE, NSF, NIH, NASA

How to Measure the Success of a Cross-disciplinary Program?
THIS TALK:
1. To measure cross-disciplinarity, define disciplines as accurately as possible
2. A general approach for automatically assigning grant-specific categories to papers and people
3. Application to the NCI PS-OC grant program classification
See also J. Basner, "Evaluating Collaboration and Outcomes of Health Research," Friday, 10/18/2013, 11:00 am, Gunston East Rm

NCI Physical Sciences-Oncology Centers
12 centers, 250 researchers, 09/2009 to current
[Slide graphic labels: Institute, Facilitate, Generate]

Evaluation: Bird's-Eye View
1. Use publications as a proxy for outcomes: 3,367 pubs; 601 reported pubs
2. Compare the baseline data set with the ongoing research data set
Data: Web of Science + Medline; 166 active PS-OC investigators; 202,000 references; 4,199 journal titles
Measures: productivity, impact, collaboration, field convergence (J. Basner, Friday, 10/18/2013)

Evaluation: Bird's-Eye View. Approach:
1. Map WoS subject categories to PS-OC categories
2. Calculate PS-OC categories for each paper
3. Calculate (weights of) research interests for each investigator
4. Validate
PS-OC 3 broad categories: Oncology, Physical Sciences, Life Sciences

PS-OC 3 broad categories: Oncology, Physical Sciences, Life Sciences
266 Web of Science Journal Subject Categories:
- Has an Oncology SC
- Multiple SCs per journal (up to 7)
- Multidisciplinary SC is meaningless for classification (e.g. "Science", "Nature")
- Some SCs are already inter-disciplinary
- Life Sciences dominate after aggregation

22 ESI Subject Categories:
- One SC per journal
- Does not have an Oncology SC (the closest is Clinical Medicine?)
- A Multidisciplinary SC also exists
- Life Sciences dominate after aggregation

Mapping Challenges. Approach:
1. Intermediate mapping onto an extended set of 6 broad categories
2. Paper-level SC assignment based on references
Inputs compared:
- PS-OC 3 broad categories: Oncology, Physical Sciences, Life Sciences
- Web of Science 266 Journal SCs: has an Oncology SC; multiple SCs per journal; Multidisciplinary; some SCs are inter-disciplinary; Life Sciences dominate after aggregation
- Web of Science 22 broad ESI categories: one SC per journal; no Oncology SC (Clinical Medicine?); a Multidisciplinary SC also exists; Life Sciences dominate after aggregation

Step 1. Introduce 6 Intermediate PS-OC Categories for Better Selection:
- PS – Physical Sciences
- LS – Life Sciences
- OC – Oncology
- MED – Medicine (very often MED journals are closer to OC than to LS)
- OTH – Others
- MULT – Multidisciplinary
The intermediate categories will be dropped at the final stage.

Step 2. Map the 265 WoS Journal SCs to the 6 PS-OC Categories:
Examples:
a) Obvious: Acoustics → PS; Chemistry, Analytical → PS; Oncology → OC; Management → OTH
b) Dominant: Biophysics → PS
c) Dominant: Physics, Multidisciplinary → PS
d) Meaningless: Multidisciplinary → MULT (usually "Nature", "Science", or "PNAS")
"Meaningless" here refers to assigning a PS-OC category: an article published in a MULT journal can be about PS, LS, or OC, and usually it is not an interdisciplinary article. Additional re-classification of the article's research field is needed, based on its references.
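As an illustration of what such a mapping table can look like in practice, here is a minimal Python sketch covering only the example categories named above; the full map over all WoS SCs is not reproduced, and the dictionary name is hypothetical.

```python
# Hypothetical excerpt of the WoS Journal Subject Category -> PS-OC category map.
# Only the slide's examples are included; the real map covers every WoS SC.
WOS_TO_PSOC = {
    "Acoustics": "PS",
    "Chemistry, Analytical": "PS",
    "Oncology": "OC",
    "Management": "OTH",
    "Biophysics": "PS",                   # dominant component: physical sciences
    "Physics, Multidisciplinary": "PS",   # dominant component: physical sciences
    "Multidisciplinary Sciences": "MULT", # e.g. Nature, Science, PNAS; resolved later via references
}
```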

Step 3. Assign PS-OC Category Weights to Each Journal
(Journals in WoS can have 1, 2, 3, … even 7 SCs.)
Example: the journal "Radiation Research" has 3 SCs:
- Biology → LS
- Biophysics → PS
- Radiology, Nuclear Medicine → PS
Map the SCs, select the distinct PS-OC categories (2 here), and use that count as the denominator.
Weights: LS = 1/2, PS = 1/2, OC = 0, MED = 0, MUL = 0, OTH = 0
Each journal should be counted equally.
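A minimal Python sketch of this step, assuming a mapping table like the WOS_TO_PSOC excerpt above and an equal split across the distinct PS-OC categories of a journal's SCs:

```python
CATEGORIES = ["PS", "LS", "OC", "MED", "MULT", "OTH"]

def journal_weights(journal_scs, sc_map):
    """Split a journal's weight equally over the distinct PS-OC categories of
    its WoS subject categories. For "Radiation Research" (Biology -> LS,
    Biophysics -> PS, Radiology/NM -> PS) this yields LS = 1/2, PS = 1/2."""
    distinct = {sc_map[sc] for sc in journal_scs if sc in sc_map}
    weights = {cat: 0.0 for cat in CATEGORIES}
    for cat in distinct:
        weights[cat] = 1.0 / len(distinct)
    return weights
```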

Step 4. Calculate Combined Journal-Reference (J-R) Weights for Publications
Example: Coffey D., Getzenberg R., JAMA, 2006; 1 journal category (MED = 1); 26 references.
Journal weights:        LS = 0,    PS = 0,     MED = 1,    OC = 0,    MUL = 0,    OTH = 0
Average refs weights:   LS = 0.23, PS = 0.04,  MED = 0.17, OC = 0.36, MUL = 0.19, OTH = 0
½ (Journal + Refs):     LS = 0.12, PS = 0.019, MED = 0.58, OC = 0.18, MUL = 0.1,  OTH = 0
Using the references gives a better assignment of the paper's field, based on what the paper cites.
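A minimal sketch of the combination rule, assuming (as the ½ (Journal + Refs) row suggests) that a paper's weights are the simple mean of its journal's weights and the average weights of its references' journals:

```python
def paper_weights(journal_w, reference_ws):
    """Combine a paper's journal weights with the average weights of the
    journals its references were published in: 0.5 * (journal + mean(refs))."""
    if not reference_ws:
        return dict(journal_w)  # no references: fall back to the journal weights alone
    avg_refs = {
        cat: sum(ref[cat] for ref in reference_ws) / len(reference_ws)
        for cat in journal_w
    }
    return {cat: 0.5 * (journal_w[cat] + avg_refs[cat]) for cat in journal_w}
```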

Step 5. Collect All Publications for Each Investigator, Calculate Average Weights, and Rank the PS-OC Categories
Example: David A., 8 pubs.
Averaged J-R weights: LS = 0.32, PS = 0.04, MED = 0.23, OC = 0.41, OTH = 0.01
Ranks: OC = 1, LS = 2, MED = 3, PS = 4, OTH = 5
Person inter-disciplinarity: 3
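A minimal sketch of the per-investigator aggregation, assuming a plain average over the investigator's papers and rank 1 for the highest weight; the slide's person inter-disciplinarity measure is not specified, so it is omitted here:

```python
def investigator_profile(paper_ws):
    """Average the J-R weights over an investigator's papers and rank the
    PS-OC categories (rank 1 = highest average weight)."""
    cats = list(paper_ws[0])
    avg = {cat: sum(p[cat] for p in paper_ws) / len(paper_ws) for cat in cats}
    ordered = sorted(cats, key=lambda c: avg[c], reverse=True)
    ranks = {cat: i + 1 for i, cat in enumerate(ordered)}
    return avg, ranks
```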

Step 6. Redistribute the MED and OTH Weights Among OC, LS, and PS
Before: LS = 0.32, PS = 0.04, MED = 0.23, OC = 0.41, OTH = 0.01
After:  LS = 0.4, PS = 0.05, OC = 0.55
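The slide does not spell out the redistribution rule; a minimal sketch under the assumption that the MED and OTH mass is spread proportionally to the remaining weights (i.e. the OC/LS/PS weights are renormalized to sum to 1), which approximately reproduces the slide's example:

```python
def redistribute(weights, keep=("OC", "LS", "PS")):
    """Drop the intermediate categories and renormalize OC/LS/PS to sum to 1.
    Assumption: proportional redistribution; for LS=0.32, PS=0.04, OC=0.41 this
    gives roughly LS=0.42, PS=0.05, OC=0.53, close to the slide's 0.4/0.05/0.55."""
    kept = {cat: weights.get(cat, 0.0) for cat in keep}
    total = sum(kept.values())
    return {cat: w / total for cat, w in kept.items()} if total else kept
```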

Validation
At the beginning of the program, investigators nominated themselves as oncologists or physicists.
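The conclusions mention precision-recall validation against these self-categorizations; a minimal sketch of that check, assuming each investigator's predicted label is the top-ranked PS-OC category and the self-nomination serves as the gold label:

```python
def precision_recall(predicted, self_nominated, label="PS"):
    """Precision and recall for one label (e.g. "PS" = physical scientist),
    treating the investigators' self-nominations as the gold standard."""
    pairs = list(zip(predicted, self_nominated))
    tp = sum(1 for p, s in pairs if p == label and s == label)
    fp = sum(1 for p, s in pairs if p == label and s != label)
    fn = sum(1 for p, s in pairs if p != label and s == label)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return precision, recall
```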

Applications: How Publication Patterns Change

Future Development
[Network diagram: physical scientists, oncologists, and life scientists; PS-OC network investigators and outside-network co-authors]

Conclusions
- An automated approach for decomposing scientific publications into grant-specific discipline categories
- A multi-step method with intermediate mapping
- Weighted SC assignment based on an article's SCs and its references' SCs
- Precision-recall validation based on investigators' self-categorizations
- Oncologists within the NCI's PS-OC program are publishing more physical sciences research, and physical scientists are publishing more oncology or life sciences research, during their years of program participation.

Thomson Reuters Custom Analytics, Rockville, MD

SUPPORTING SLIDES