
Presentation transcript:

Walk softly and carry a large carrot: how to give credit for academic work
RDA/WDS Publishing Data Bibliometrics Working Group

Sarah Callaghan 1, Todd Carpenter 2, John Kratz 3
1 British Atmospheric Data Centre, STFC Rutherford Appleton Laboratory, Harwell Oxford, Didcot, OX12 7DQ, UK. 2 National Information Standards Organization (NISO). 3 California Digital Library.

Abstract
Researchers want to know how their work impacts their communities and the wider world, including research outputs other than peer-reviewed journal publications. The journal paper provides a way of claiming and defining an area of intellectual work, and citation of articles allows others to acknowledge that work. Yet the paper can only give an overview of the work: it is not possible to publish in a paper everything needed to make the work fully reproducible. For providing credit (and for making recruitment and promotion decisions) we abstract the paper further. Instead of reading every citing paper, we count the citations, reckoning this an appropriate proxy for the quality of the paper, and hence of the work it describes. Citation counts for datasets are one of the "carrots" promised to researchers for their efforts in citing and publishing data, and they also produce a metric by which the quality of a dataset can be evaluated. Quality is a slippery concept when it comes to data, which can be good quality for one purpose and bad for another. Measuring the impact of research directly is difficult, so we resort to measuring what we can (the number of citations). Care must be taken with indirect measurements to ensure that they map appropriately to what we really want to measure.

Survey results
This survey was carried out by the RDA/WDS Publishing Data Bibliometrics Working Group. It aimed to ask interested parties what they currently use to evaluate the impact of data, and what they would like to use in the future. The survey was carried out via a web-based survey system (SurveyMonkey), and invitations to participate were distributed widely, mainly through mailing lists for interested parties. There were 115 respondents.

1) Who responded?
Other responses: Librarian (3), Data scientist/data manager/data analyst (7), Student/assistant (2), Writer/editor/publications support (3), Programme manager (1), Computer scientist (1).

2) What is your field of specialty?

3) How long have you been working in this field?

4-5) What do you currently use to evaluate the impact of data?
Other responses: "Impact factor of journal in which data has been published" / "journal articles"; "figshare stats" / "ResearchGate data"; "Mostly still citations but we are starting to use e.g. altmetrics and twitter reach/impact scores" / "altmetrics impact study being formulated for 2015"; "rich user registration metadata and their access activity"; "Talk to scholars" / "ad hoc feedback" / "informal assessment".

6) Are the methods you use to evaluate impact adequate for your needs? If not, why not?
33 responses. Majority opinion: current metrics are not good enough, there are no standards, and respondents don't know what to do. Other opinions: impact metrics are not important to the respondent; there is interest in quantifying impact, but the repository/policies are still under development; metrics are too easily gamed, or too complicated.
75 responses to "if not, why not?": lack of tools and standards; limited data citation; available measures not good enough; too difficult/time-consuming; need to focus on other (non-scientific) impacts (e.g. planning/educational use of data); hoping increased use of DOIs will help the situation.

7) Why do you want to evaluate the impact of data?
30 responses, mostly different: measure organisational impact; encourage data openness/publication (and provide benefit for data producers); knowledge of who uses data and why, to improve access for users, inform data retention decisions, improve user experience, justify repository investment, and discover the impact of data centre policies; evaluate employees' performance/institutional requirement.

8) In the future, what would you like to use to evaluate the impact of data?
CITATIONS! (and better tools to track them); downloads; altmetrics/"anything and everything"; peer review/community feedback; use outside the scholarly literature (e.g. in patents); reuse/"actual use".

9) What is currently missing and/or needs to be created for bibliometrics for data to become widely used?
STANDARDS! Data citation; consistent use of PIDs/DOIs; culture change / "A belief that they are valid."

10) What difference would it make to your work to be able to evaluate the impact of data?
91 responses. Common themes: promote data sharing/publication/reuse/data stewardship and provide credit for data producers; justify funding for data activities; other criteria for evaluation of research impact; inform and prioritise data access systems and improve services; influence in public-policy decision making.

11) Do you know of any tools from other research areas that evaluate data impact?
Thomson Reuters DCI/Web of Science; "indicator programs in the UN"; "tools most likely lie in the field of economics and quantitative analysis (the value of decision information)"; "The numerical weather prediction community uses Observing System Simulation Experiments (OSSEs) to evaluate data impact on prediction accuracy."

Further work
Clearly there is a lot of interest in this area, and for bibliometrics for data to be accepted and used widely, standards and tools must be designed to work for the community. Concerns were raised about the possibility of "gaming" metrics, and that the solutions adopted should be open and free to use. If you are interested in contributing to this work, please join the RDA/WDS Publishing Data Bibliometrics Working Group: https://rd-alliance.org/groups/rdawds-publishing-data-bibliometrics-wg.html
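The abstraction described in the poster (counting citations of a dataset rather than reading each citing work) amounts to a simple tally of citing records per persistent identifier. The sketch below is illustrative only: the record format, article names, and DOIs are hypothetical, not drawn from the working group's survey or any real index.

```python
from collections import Counter

# Hypothetical citation records: (citing article, cited dataset DOI).
# Both the article labels and the DOIs are made up for illustration.
citations = [
    ("article-A", "10.0000/dataset-1"),
    ("article-B", "10.0000/dataset-1"),
    ("article-C", "10.0000/dataset-2"),
    ("article-D", "10.0000/dataset-1"),
]

def citation_counts(records):
    """Tally citations per dataset DOI: the crude proxy metric for impact."""
    return Counter(doi for _, doi in records)

counts = citation_counts(citations)
print(counts["10.0000/dataset-1"])  # 3
```

Note that this tally only works if citations use a consistent identifier for each dataset, which is exactly why the survey respondents put "consistent use of PIDs/DOIs" high on the list of missing standards.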