
1 Citation Counting, Citation Ranking, and h-Index of HCI Researchers: Scopus vs. WoS
Lokman I. Meho and Yvonne Rogers
Network and Complex Systems, March 24, 2008

2 Why citation analysis?
– Study the evolution of scientific disciplines
– Examine and/or map the social, economic, political, and intellectual impact of scientific research
– Assist in certain decisions (promotion, tenure, hiring, grants, collaboration, etc.)

3 Research problem
– To date, most citation-based research has relied exclusively on data obtained from the Web of Science database
– The emergence of Scopus and Google Scholar has raised many questions about the exclusive use of Web of Science

4 Literature review
– Whether to use Scopus and/or Web of Science as part of a mapping or research assessment exercise may be domain-dependent; more in-depth studies are needed to verify the strengths and limitations of each source
– Scopus covers 84% of all journal titles indexed in Web of Science; Web of Science covers 54% of all journal titles indexed in Scopus

5 Research questions
– How do the two databases compare in their coverage of HCI literature and the literature that cites it, and what are the reasons for the differences?
– What impact do the differences in coverage between the two databases have on the citation counting, citation ranking, and h-index scores of HCI researchers?
– Should one or both databases be used for determining the citation counting, citation ranking, and h-index scores of HCI researchers?

6 Significance/value of study
– Determine whether citation searching in HCI should be extended to both Scopus and Web of Science or limited to one of them
– Will help people who use citation analysis for research evaluation and mapping exercises justify their choice of database

7 Databases
Web of Science
– Approximately 9,000 journals going back to 1955
– Books in series and an unknown number of conference proceedings, including LNCS, LNAI, LNM
Scopus
– 14,000 journals going back to 1996 for citations
– 500 conference proceedings
– 600 trade publications

8 Methods
Sample
– 22 top HCI researchers from the Equator Interdisciplinary Research Collaboration, a six-year project funded by the UK’s Engineering and Physical Sciences Research Council
Publications (n=1,440, mainly conference papers and journal articles)
– 594 (41%) were covered by Scopus
– 296 (21%) were covered by Web of Science
– 647 (45%) were covered by one or both databases

9 Methods, cont’d
Searching methods used to identify citations to the 1,440 items published/produced by the sample members:
– Scopus: (1) exact match of each item in the “References” field; (2) the “More” tab; and (3) “Author” search results + “Cited by”
– WoS: cited-reference search
Citation information was parsed by author, publication type, year, source name, institution, country, and language
Source names were manually standardized, and missing institutional affiliation and country information (3%) was gleaned from the web

10 Methods, cont’d
Data from both databases were cross-examined for accuracy
h-index
– Definition, strengths, and limitations
– System-based counting method (takes into account only indexed works, n=647)
– Manual-based counting method (takes into account all 1,440 works published/produced by the sample)
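The two counting methods differ only in which citation counts feed the same calculation. A minimal sketch of the h-index (the citation counts below are hypothetical, for illustration only):

```python
def h_index(citation_counts):
    """Largest h such that at least h works have >= h citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break  # counts are sorted, so no later work can qualify
    return h

# Hypothetical researcher: the system-based count sees only indexed works...
indexed_works = [25, 18, 12, 7, 3]
# ...while the manual count also includes citations to unindexed works.
all_works = indexed_works + [6, 5, 2, 1]

print(h_index(indexed_works))  # 4  (system-based)
print(h_index(all_works))      # 5  (manual)
```

Because unindexed works can only add citation counts, the manual h-index is always greater than or equal to the system-based one, which is the pattern the later slides report.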

11 Results: Distribution of unique and overlapping citations
– Scopus: n=6,919 (93%)
– Web of Science: n=4,011 (54%)
– Unique to Scopus: 3,491 (47%)
– Unique to WoS: 520 (7%)
– Found in both: 3,428 (46%)
– WoS ∪ Scopus = 7,439*
*Excludes 255 citations from WoS published before 1996
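As a sanity check, the slide's overlap figures can be recombined in a few lines (numbers taken from the slide; the per-region percentages are relative to the union):

```python
# Citation counts from the slide (255 pre-1996 WoS citations excluded).
scopus_only = 3491   # unique to Scopus
wos_only = 520       # unique to Web of Science
overlap = 3428       # found in both

union = scopus_only + wos_only + overlap
print(union)  # 7439

for label, n in [("unique to Scopus", scopus_only),
                 ("unique to WoS", wos_only),
                 ("in both", overlap)]:
    print(f"{label}: {n / union:.0%}")
# unique to Scopus: 47%
# unique to WoS: 7%
# in both: 46%
```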

12 Results: Reasons for the significant differences
Note: 76% of all citations found in conference proceedings were unique to a single database, compared with 34% of citations found in journals

13 Results: Quality of Scopus unique citing journals

Rank | Source of citations | WoS | Scopus | Union | Scopus IF (rank) | JCR Impact Factor
1 | Presence: Teleoperators and Virtual Environments | 156 | 155 | 159 | 1.480 (10) | 1.000
2 | International Journal of Human-Computer Studies | 130 | 123 | 131 | 1.615 (7) | 1.094
3 | Interacting with Computers | 113 | 105 | 115 | 1.140 (17) | 0.833
4 | Computer Supported Cooperative Work | – | 91 | 91 | 2.000 (4) | NA
5T | Cyberpsychology & Behavior | 53 | | 60 | 1.269 (13) | 1.061
5T | IEEE Pervasive Computing | 50 | 48 | 60 | 2.971 (2) | 2.062
5T | Personal and Ubiquitous Computing* | 48 | 57 | 60 | 1.427 (12) | NA
8 | Behaviour & Information Technology | 57 | 58 | 59 | 1.097 (19) | 0.743
9 | J. of the Am. Soc. for Info. Sci. and Tech. | 53 | 46 | 55 | 1.766 (6) | 1.555
10 | Human-Computer Interaction | 44 | 41 | 46 | 3.043 (1) | 2.391
19T | ACM Transactions on Computer-Human Interaction | – | 23 | 23 | 2.861 (3) | NA
19T | New Review of Hypermedia and Multimedia | – | 23 | 23 | 0.565 (22) | NA

This is a partial list of the top 20 citing journals ("–" = not indexed in WoS)

14 Results: Quality of Scopus’s citing conference proceedings (top 9 citing titles)

Rank | Source of citations | WoS citations | Scopus citations | Union citations | IF* (rank)
1 | ACM Conference on Human Factors in Computing Systems | Not indexed | 211 | 211 | 2.478 (1)
2 | ACM Conference on Computer Supported Cooperative Work | Not indexed | 72 | 72 | –
3 | Ubicomp: Ubiquitous Computing, Proceedings (LNCS) | 67 | 55 | 69 | –
4 | IEEE International Conference on Pervasive Computing and Communications, PerCom | Not indexed | 64 | 64 | 0.934 (3)
5 | Proceedings of SPIE – The International Society for Optical Engineering | Not indexed | 60 | 60 | –
6 | ACM Conference on Human-Computer Interaction with Mobile Devices and Services, MobileHCI (LNCS) | 45 | 48 | 58 | –
7 | IEEE Virtual Reality Conference | Not indexed | 51 | 51 | 0.612 (5)
8 | ACM Conference on Hypertext and Hypermedia** | Not indexed | 49 | 49 | 0.915 (4)
9 | ACM Conference on Designing Interactive Systems, DIS | Not indexed | 45 | 45 | –

*Source: Scopus.

15 Differences in citation counting and ranking of individual researchers (top 12)

Name | WoS count | WoS rank | Scopus count | Scopus rank | Difference count (%) | Difference in rank | Union count | Union rank
Rogers* | 753 | 1 | 1,229 | 1 | 476 (63%) | 0 | 1,319 | 1
Benford* | 572 | 3 | 1,179 | 2 | 607 (106%) | 1 | 1,244 | 2
Rodden* | 577 | 2 | 1,075 | 3 | 498 (86%) | −1 | 1,138 | 3
De Roure* | 421 | 5 | 764 | 4 | 343 (81%) | 1 | 834 | 4
Gaver* | 427 | 4 | 704 | 5 | 277 (65%) | −1 | 753 | 5
Friday* | 348 | 8 | 649 | 6 | 301 (86%) | 2 | 677 | 6
Schmidt | 329 | 9 | 607 | 7 | 278 (84%) | 2 | 654 | 7
Gellersen* | 311 | 10 | 591 | 8 | 280 (90%) | 2 | 627 | 8
Cheverst | 352 | 7 | 586 | 9 | 234 (66%) | −2 | 618 | 9
Steed* | 354 | 6 | 584 | 10 | 230 (65%) | −4 | 615 | 10
Chalmers* | 256 | 11 | 414 | 11 | 158 (62%) | 0 | 442 | 11
Crabtree | 136 | 14 | 326 | 12 | 190 (140%) | 2 | 334 | 12
TOTAL | 4,011 | | 6,919 | | 2,908 (73%) | | 7,439 |

16 Differences in mapping scholarly impact of individual researchers: an example

Top citing authors (Benford), 64% mismatch
– Web of Science: Pilar Herrero (10), Chris Greenhalgh (6), Ling Chen (5), Jin Zhang (5), Paul Luff (4), Minh Hong Tran (4)
– Scopus: Pilar Herrero (13), Ling Chen (10), Andy Crabtree (10), Azzedine Boukerche (8), Carl Gutwin (7)

Top citing sources (Benford), 80% mismatch
– Web of Science: Presence: Teleoperators and Virtual Environments (57), UbiComp (21), International Journal of Human-Computer Studies (15), Interacting with Computers (14), Personal and Ubiquitous Computing (11)
– Scopus: CHI Conference (58), Presence: Teleoperators and Virtual Environments (57), Int. Conf. on Collaborative Virtual Environments (32), Computer Supported Cooperative Work (31), IEEE Virtual Reality Conference (22)

17 Differences in mapping scholarly impact of individual researchers, cont’d

Top citing institutions* (Benford), 67% mismatch
– Web of Science: University of Nottingham (33), University of Sussex (14), Lancaster University (11), Universidad Politécnica de Madrid (10), King's College London (8)
– Scopus: University of Nottingham (80), University of Ottawa (23), University College London (21), Zhejiang University (19), Fraunhofer-Gesellschaft (16), Georgia Institute of Technology (16), Lancaster University (16)

Top citing countries (Benford), 40% mismatch
– Web of Science: United Kingdom (158), United States (127), Germany (30), Japan (28), Australia (25)
– Scopus: United Kingdom (312), United States (234), China (69), Japan (65), Canada (52)

*The percentage of mismatch would have been higher had we removed citations from the home institution of the researcher

18 Differences in average h-index

19 Difference in h-index of individual researchers

Name | WoS system count | WoS manual count | Scopus system count | Scopus manual count | Union system count | Union manual count | Difference
Benford* | 7 | 14 | 12 | 22 | 12 | 24 | 100%
Rodden* | 5 | 13 | 12 | 19 | 12 | 21 | 75%
Gaver* | 3 | 14 | 8 | 20 | 8 | 20 | 150%
De Roure* | 6 | 12 | 8 | 17 | 9 | 19 | 111%
Rogers* | 7 | 11 | 9 | 15 | 9 | 17 | 89%
Steed* | 6 | 11 | 10 | 16 | 10 | 16 | 60%
Gellersen* | 6 | 8 | 10 | 14 | 10 | 15 | 50%
Schmidt | 5 | 9 | 9 | 14 | 9 | 15 | 67%
Chalmers* | 2 | 7 | 8 | 13 | 8 | 13 | 63%
Cheverst | 5 | 9 | 7 | 12 | 7 | 13 | 86%
AVERAGE | 3.5 | 8.0 | 6.8 | 12.3 | 6.9 | 13.0 | 89%

This is a partial list of the top 10 researchers
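The Difference column appears to be the relative increase from the union system count to the union manual count (an interpretation inferred from the rows, e.g. Benford 12 → 24 is +100%, Rodden 12 → 21 is +75%). A one-line sketch of that calculation:

```python
def pct_increase(system_count, manual_count):
    """Relative gain of the manual h-index over the system-based one, in percent."""
    return round((manual_count - system_count) / system_count * 100)

print(pct_increase(12, 24))  # 100  (Benford)
print(pct_increase(12, 21))  # 75   (Rodden)
```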

20 Comparison of h-index between GS and WoS+Scopus

Researcher | WoS∪Scopus score | Rank | Google Scholar score | Rank | Difference
Benford* | 24 | 1 | 38 | 1T | 58%
Rodden* | 21 | 2 | 38 | 1T | 81%
Gaver* | 20 | 3 | 32 | 3 | 60%
De Roure* | 19 | 4 | 27 | 4T | 42%
Rogers* | 17 | 5 | 27 | 4T | 59%
Cheverst | 13 | 9T | 25 | 6T | 92%
Gellersen* | 15 | 7T | 25 | 6T | 67%
Steed* | 16 | 6 | 25 | 6T | 56%
Schmidt | 15 | 7T | 24 | 9 | 60%
Friday* | 13 | 9T | 23 | 10 | 77%
Chalmers* | 13 | 9T | 21 | 11 | 62%
Crabtree | 13 | 9T | 20 | 12 | 54%
Brown | 10 | 14T | 18 | 13 | 80%
Fitzpatrick* | 10 | 14T | 17 | 14 | 70%
Muller* | 9 | 17T | 15 | 15T | 67%
Stanton-Fraser | 11 | 13 | 15 | 15T | 36%
Weal | 10 | 14T | 14 | 17 | 40%
Randell | 9 | 17T | 13 | 18 | 44%
Izadi | 8 | 19 | 12 | 19 | 50%
Schnädelbach | 7 | 20 | 9 | 20 | 29%
Barkhuus | 6 | 21T | 8 | 21T | 33%
Price | 6 | 21T | 8 | 21T | 33%
AVERAGE | 13.0 | | 20.6 | | 59%

21 Conclusions and implications
– In HCI, conference proceedings constitute a major channel of written communication
– Most of these proceedings are published by ACM and IEEE, and also by Springer in the form of LNCS and LNAI
– Scopus should be used instead of WoS for citation-based research and evaluation in HCI

22 Conclusions and implications, cont’d
– The h-index should be calculated manually rather than by relying on system-generated scores
– Researchers can no longer limit themselves to WoS just because they are familiar with it, have access to it, or because it is the more established data source
– A challenge is to systematically explore citation data sources to determine which one(s) are better for which research domains

23 Conclusions and implications, cont’d
Principles of good bibliometrics research:
– Analysis should be carried out only by professionals with a theoretical understanding and thorough technical knowledge of the databases, retrieval languages, and the abbreviations, concepts, and/or terminologies of the domain under investigation
– Analysis should only be used in accordance with the established principles of “best practice” of professional bibliometrics
– If utilized for research assessment purposes, citation-based information should only be used in conjunction with qualitative peer-review-based information

24 Thank You
Questions? meho@indiana.edu
Full paper available at: http://www.slis.indiana.edu/faculty/meho/meho-rogers.pdf
Network and Complex Systems, March 24, 2008

