
1 OVERCOMING APPLES TO ORANGES: ACADEMIC RESEARCH IMPACT IN THE REAL WORLD Thane Chambers Faculty of Nursing & University of Alberta Libraries Leah Vanderjagt University of Alberta Libraries Netspeed 2014

2 WELCOME ● Who we are ● What we’ll cover o What is the broad context for the need for research performance metrics in post-secondary education? o What are some issues with research performance metrics? o How can we improve research performance assessment?

3 WHAT IS THIS ALL ABOUT? A brief history of research evaluation

4 FUNDAMENTAL PRINCIPLES OF RESEARCH METRICS: ● Single indicators are misleading on their own ● Integration of both qualitative & quantitative data is necessary ● Various frameworks for research performance already exist

5 FUNDAMENTAL PRINCIPLES OF RESEARCH METRICS: ● Times cited ≠ Quality ● Discipline-to-discipline comparison is inappropriate ● Citation databases cover certain disciplines better than others
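One standard mitigation for the discipline-comparison problem is field normalization: divide each paper's citation count by the average count for its field and publication year, so a score of 1.0 means "cited at the field average". Below is a minimal sketch of that idea; all records and counts are invented for illustration, not drawn from any real dataset.

```python
# Field normalization sketch: raw citation counts are divided by the
# average for the same (field, year), making scores comparable across
# disciplines with very different citation cultures. Data is invented.
from collections import defaultdict

papers = [
    {"title": "A", "field": "Nursing", "year": 2012, "citations": 8},
    {"title": "B", "field": "Nursing", "year": 2012, "citations": 2},
    {"title": "C", "field": "Physics", "year": 2012, "citations": 80},
    {"title": "D", "field": "Physics", "year": 2012, "citations": 40},
]

# Baseline: average citations per (field, year).
totals = defaultdict(list)
for p in papers:
    totals[(p["field"], p["year"])].append(p["citations"])
baseline = {key: sum(v) / len(v) for key, v in totals.items()}

# Normalized score: 1.0 = field average. Paper A (8 citations) outperforms
# its field, while paper D (40 citations) underperforms its own.
for p in papers:
    score = p["citations"] / baseline[(p["field"], p["year"])]
    print(f'{p["title"]}: {score:.2f}')
```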

6 BROAD/GLOBAL CONTEXT ● Recent upswing in interest about research evaluation ● Nationwide assessments in UK, Australia, New Zealand ● An audit culture is growing

7 METRICS WANTED For what? ● Performance (strengths and weaknesses) ● Comparison (with other institutions) ● Collaboration (potential and existing) Both traditional metrics and altmetrics are on the table.
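To make the traditional/altmetrics contrast concrete, here is a small sketch that pulls one indicator of each kind for a single DOI: a citation count from Crossref's public REST API and an attention score from Altmetric.com's free endpoint. The DOI is an arbitrary published example, and coverage of any given work by either service is not guaranteed.

```python
# Two metric families for one DOI: a traditional citation count
# (Crossref's "is-referenced-by-count") and an altmetric attention score
# (Altmetric.com, which returns 404 for works it does not track).
import requests

doi = "10.1038/nature12373"  # arbitrary real DOI, used only as an example

# Traditional: citations recorded by Crossref.
cr = requests.get(f"https://api.crossref.org/works/{doi}", timeout=10).json()
print("Citations (Crossref):", cr["message"]["is-referenced-by-count"])

# Altmetrics: online attention tracked by Altmetric.com.
alt = requests.get(f"https://api.altmetric.com/v1/doi/{doi}", timeout=10)
if alt.ok:
    print("Altmetric attention score:", alt.json().get("score"))
```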

8 WHO’S IN THE GAME: CONSUMERS ● Senior university administrators ● Funding agencies ● Institutional funders ● Researchers ● Librarians

9 WHO’S IN THE GAME: PRODUCERS (VENDORS) ● Elsevier: Scopus and SciVal ● Thomson Reuters: Web of Science and InCites ● Digital Science: Symplectic Elements ● Article-level metrics (altmetrics) solutions

10 VENDOR CLAIMS ● Quick, easy, and meaningful benchmarking ● Ability to devise optimal plans ● Flexibility ● Insightful analysis to identify unknown areas of research excellence … all with the push of a button!

11 WHAT DO WE FIND WHEN WE TEST THESE CLAIMS?

12 WHAT’S NEEDED: PERSISTENT IDENTIFIERS ● Without DOIs, how can impact be tracked? ● ISBNs, repository handles ● Disciplinary and geographic differences in DOI coverage: DOI assignment costs $$ ● What about grey literature? ● Altmetrics may still depend on DOIs
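The tracking problem is easy to see in practice: a work with a registered DOI can be resolved to authoritative metadata in one API call, while grey literature and other unregistered works simply cannot be looked up this way. A minimal sketch against Crossref's public REST API follows (requires the `requests` package; the DOI is an arbitrary published example).

```python
# Why persistent identifiers matter: a DOI either resolves to a metadata
# record at Crossref or it does not exist for tracking purposes at all.
import requests

def lookup_doi(doi):
    """Return Crossref metadata for a DOI, or None if it is not registered."""
    resp = requests.get(f"https://api.crossref.org/works/{doi}", timeout=10)
    if resp.status_code == 404:
        return None  # no DOI record: this work is invisible to DOI-based tools
    resp.raise_for_status()
    return resp.json()["message"]

record = lookup_doi("10.1038/nature12373")
if record:
    print(record["title"][0], "-", record.get("publisher"))
```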

13 WHAT’S NEEDED: NAME DISAMBIGUATION (THE BIGGEST PROBLEM)
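A toy example (invented records) shows why this is the biggest problem. The common heuristic of collapsing names to "surname, first initial" both merges different people who share a name form and splits one person across name variants; it is exactly the failure mode that persistent author identifiers eliminate.

```python
# Naive name matching on invented records: all three collapse to one key,
# wrongly merging a Toronto author with the Alberta one.
records = [
    {"author": "Chambers, T.",    "affil": "University of Alberta"},
    {"author": "Chambers, Thane", "affil": "University of Alberta"},
    {"author": "Chambers, T.",    "affil": "University of Toronto"},
]

def naive_key(name):
    # Collapse to "surname, first initial" - a common but lossy heuristic.
    surname, _, rest = name.partition(",")
    initial = rest.strip()[:1]
    return f"{surname.strip().lower()}, {initial.lower()}."

groups = {}
for r in records:
    groups.setdefault(naive_key(r["author"]), []).append(r["affil"])

# One key, three affiliations: the two Alberta variants are correctly
# merged, but the Toronto namesake is merged in as well.
print(groups)
```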

14 WHAT’S NEEDED: SOURCE COVERAGE ● The most prominent products still draw their source coverage from Scopus and Web of Science (both STEM-heavy) ● Integration of broader sources is packaged with more expensive implementations ● Some products specifically market broad source coverage (e.g., Symplectic Elements)

15 WHAT’S NEEDED: NATIONAL SUBJECT AREA CLASSIFICATION (TO A FINE LEVEL) ● Subject classification in these products is EXTREMELY broad - so broad that comparisons are inappropriate ● Integration of a national standard for granular subject classification would help everyone

16 SUBJECT CLASSIFICATION EXAMPLE http://www.rcuk.ac.uk/RCUK-prod/assets/documents/documents/ResearchAreasProposalClassificationsList.pdf

17 WHAT’S NEEDED: TRAINING & KNOWLEDGE ● Do all CONSUMERS want/need training? ● Have we analyzed our services for citation impact and metrics analysis? ● Top-to-bottom organizational training, grounded in the strategic needs identified for metrics

18 WHAT’S NEEDED: PROCESSES & WORKFLOWS ● Data cleaning ● Verification of new data ● Running analysis ● Verifying analysis
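As a concrete illustration of the data-cleaning and verification steps, the sketch below normalizes DOIs, drops exact duplicates, and flags records that cannot be verified automatically because they lack an identifier. The input records are invented; a real workflow would add many more checks before any analysis is run.

```python
# Data-cleaning sketch: normalize DOIs, deduplicate, and route
# identifier-less records (e.g., grey literature) to manual review.
raw = [
    {"title": "Paper A",  "doi": "10.1000/XYZ123"},
    {"title": "Paper A",  "doi": "https://doi.org/10.1000/xyz123"},  # duplicate
    {"title": "Report B", "doi": None},                              # grey lit
]

def normalize_doi(doi):
    # DOIs are case-insensitive; strip the resolver prefix if present.
    if not doi:
        return None
    return doi.lower().removeprefix("https://doi.org/").strip()

seen, cleaned, needs_review = set(), [], []
for rec in raw:
    doi = normalize_doi(rec["doi"])
    if doi is None:
        needs_review.append(rec)  # verify manually before analysis
    elif doi not in seen:
        seen.add(doi)
        cleaned.append({**rec, "doi": doi})

print(len(cleaned), "clean records;", len(needs_review), "need manual verification")
```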

19 WHAT’S NEEDED: CULTURAL UNDERSTANDING ● How is the data going to be used? And who will be rewarded? ● An audit culture ● The social sciences, humanities, and arts have justified concerns about the adoption of citation-based tools

20 HOW CAN ACADEMIC LIBRARIES HELP? ● Share our knowledge of best practices and other effective implementations ● Challenge vendors to address these problems ● Train for author ID systems and assignment, and integrate author IDs with digital initiatives

21 HOW CAN ACADEMIC LIBRARIES HELP? ● Advocate for national comparison standards (CASRAI) ● Employ our subject-focused outreach model ● As a central unit, make broad organizational connections to help with implementation ● Promote our expertise: bibliographic analysis is an LIS domain

22 RECOMMENDATIONS ● Strategic leaders need to initiate university-wide conversations about what research evaluation means for the institution ● Tools need to be flexible enough to incorporate non-journal-based scholarly work and data ● New workflows need to be minimized and incorporated into existing workflows as much as possible ● Broad adoption of the ORCID iD system
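The ORCID recommendation is the one that directly attacks the name-disambiguation problem above: a single stable iD resolves to a researcher's works list via ORCID's public API, no string matching required. A minimal sketch follows, using ORCID's own published example iD.

```python
# What broad ORCID adoption enables: one persistent iD resolves a
# researcher's public works list, sidestepping name ambiguity entirely.
import requests

ORCID_ID = "0000-0002-1825-0097"  # ORCID's published example record

resp = requests.get(
    f"https://pub.orcid.org/v3.0/{ORCID_ID}/works",
    headers={"Accept": "application/json"},
    timeout=10,
)
resp.raise_for_status()
works = resp.json().get("group", [])
print(f"{ORCID_ID} has {len(works)} work groups on their public record")
```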

23 REFERENCES Marjanovic, S., Hanney, S., & Wooding, S. (2009). A historical reflection on research evaluation studies, their recurrent themes and challenges. Technical report. RAND Corporation. Moed, H. F. (2005). Citation analysis in research evaluation. Dordrecht: Springer.

