CERIFy Snowball Metrics: Output from a meeting in London on 20 June 2013 between Brigitte Joerg, Anna Clements, and Lisa Colledge

euroCRIS and Snowball Metrics

– The Indicators Task Group aims to develop an active programme on the use of indicators for evaluating research.
– The Snowball Metrics are one of multiple sets of indicators that will be handled.
– The methods have already been defined, tested, and published; the aim is now to translate them into CERIF, to facilitate their adoption in multiple systems and to enable institutions to benchmark their performance regardless of the systems they have implemented.

Principles of CERIFication of Snowball Metrics

– Be generic, to ensure that the CERIFication is applicable globally (not only in the UK).
– The indicator is defined at policy level (by the Snowball Metrics project partners).
– When transferring metrics between systems, only the final number will be transferred. For example, when normalising by FTE, we will enable transfer of the calculated value, rather than sending the metric plus the FTE count and expecting the recipient to complete the normalisation themselves (a sketch of this principle follows below).
– We describe the data source, but not the calculation that a specific system uses to generate the metric, so that we follow the principle of Snowball Metrics being system-agnostic.
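
As a minimal sketch of the "final number only" principle: a per-FTE metric travels as one pre-computed value. The element names and identifiers below are illustrative assumptions, not the normative CERIF serialization, which is only defined in the next steps.

```xml
<!-- Minimal sketch: a per-FTE metric travels as one pre-computed value.
     All element names and identifiers here are illustrative assumptions. -->
<cfMeasurement>
  <cfMeasId>meas-2013-0001</cfMeasId>
  <cfIndicatorId>snowball-applications-volume</cfIndicatorId>
  <!-- The sender has already normalised by FTE; only the result travels. -->
  <cfValFloat>2.7</cfValFloat>
  <!-- Deliberately absent: the FTE count and any calculation details,
       so the recipient never has to re-derive the normalisation. -->
</cfMeasurement>
```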

Expressing Snowball Metrics in CERIF requires 2 areas of focus

For a Snowball Metric:
– Calculating: CERIF is used to explain how the metric is calculated by combining data elements in a particular manner ("how do I reach the value 7?").
– Understanding: CERIF is used to communicate what a value represents, so that a tool receiving information can understand what it has ("what does this value 7 represent?"). This is the focus of this deck.

Hierarchy

Snowball Metrics:
– Input Metrics: Applications Volume; Awards Volume
– Process Metrics: Income Volume; Market Share
– Output Metrics: Scholarly Output; Citation Count; h-index; Field-Weighted Citation Impact; Outputs in Top Percentiles; Collaboration

*Indicator schemes cover multiple options for metric set-up: Collaboration has a limited set of 2 options, while Market Share and Outputs in Top Percentiles have unlimited options; we suggest listing them out in additional hierarchies associated with these 2 metrics (the hierarchy itself is sketched below).
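
One way to carry this hierarchy between systems is CERIF's semantic layer, i.e. a classification scheme whose classes are the metric groups and the metrics. The sketch below follows the cfClassScheme / cfClass pattern; the scheme and class identifiers are invented for illustration, and the class-to-class link that would nest a metric under its group is omitted for brevity.

```xml
<!-- Sketch: the Snowball Metrics hierarchy as a classification scheme.
     Identifiers are invented for illustration. -->
<cfClassScheme>
  <cfClassSchemeId>snowball-metrics</cfClassSchemeId>
  <cfName cfLangCode="en">Snowball Metrics</cfName>
</cfClassScheme>
<cfClass>
  <cfClassId>input-metrics</cfClassId>
  <cfClassSchemeId>snowball-metrics</cfClassSchemeId>
  <cfTerm cfLangCode="en">Input Metrics</cfTerm>
</cfClass>
<cfClass>
  <cfClassId>applications-volume</cfClassId>
  <cfClassSchemeId>snowball-metrics</cfClassSchemeId>
  <cfTerm cfLangCode="en">Applications Volume</cfTerm>
</cfClass>
```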

Generic layout and translation to the structure used by the Snowball Metrics Recipe Book

– cf Indicator: the metric's definition(s)
– cf Measurement Class: the y-axis view
– cf Orgunit: the x-axis view / denominator
– Also recorded: data source; date measurement was created

Applications Volume

– cf Indicator: Applications Volume description
– cf Measurement Class: Count of Applications (cf Count); Price of Applications (cf Float); Count per FTE (cf Count); Price per FTE (cf Float)
– cf Orgunit: Institution; Discipline; Funder type
– Also recorded: Measurement period; Service; Date measurement was created; Date of data currency (a worked sketch follows below)
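
Putting the generic layout together, a single transferred Applications Volume measurement might look like the sketch below; every element name and identifier is an illustrative assumption rather than the agreed serialization.

```xml
<!-- Sketch: one Applications Volume measurement for one institution.
     Element names and identifiers are illustrative assumptions. -->
<cfMeasurement>
  <cfMeasId>meas-appvol-2013-0042</cfMeasId>
  <cfIndicatorId>snowball-applications-volume</cfIndicatorId>  <!-- cf Indicator -->
  <cfClassId>count-of-applications</cfClassId>                 <!-- cf Measurement Class -->
  <cfOrgUnitId>org-example-university</cfOrgUnitId>            <!-- cf Orgunit (denominator) -->
  <cfValCount>137</cfValCount>
  <cfStartDate>2012-08-01</cfStartDate>                        <!-- measurement period -->
  <cfEndDate>2013-07-31</cfEndDate>
  <cfService>institutional-cris</cfService>                    <!-- data source -->
  <cfCreationDate>2013-06-20</cfCreationDate>                  <!-- date measurement was created -->
  <cfDataCurrencyDate>2013-06-01</cfDataCurrencyDate>          <!-- date of data currency -->
</cfMeasurement>
```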

Awards Volume

– cf Indicator: Awards Volume description
– cf Measurement Class: Count of Awards (cf Count); Value of Awards (cf Float); Count per FTE (cf Count); Value per FTE (cf Float)
– cf Orgunit: Institution; Discipline; Funder type
– Also recorded: Measurement period; Service; Date measurement was created; Date of data currency

Income Volume

– cf Indicator: Income Volume description
– cf Measurement Class: Income Spent (cf Float); Income Spent per FTE (cf Float)
– cf Orgunit: Institution; Discipline; Funder type
– Also recorded: Measurement period; Service; Date measurement was created; Date of data currency

Market Share scheme

Snowball Metrics → Process Metrics → Market Share, with an open-ended set of variants (modelled below as an associated classification scheme):
– Share of UK
– Share of England
– Share of Pure user group
– Share of Russell Group
– Etc.
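
Because the variant list is open-ended, the suggested "additional hierarchy" could be an associated scheme to which new variants are added as plain classes; the identifiers below are invented for illustration.

```xml
<!-- Sketch: open-ended Market Share variants as classes in an associated
     scheme. Identifiers are invented for illustration. -->
<cfClassScheme>
  <cfClassSchemeId>snowball-market-share-variants</cfClassSchemeId>
  <cfName cfLangCode="en">Market Share variants</cfName>
</cfClassScheme>
<cfClass>
  <cfClassId>share-of-uk</cfClassId>
  <cfClassSchemeId>snowball-market-share-variants</cfClassSchemeId>
  <cfTerm cfLangCode="en">Share of UK</cfTerm>
</cfClass>
<cfClass>
  <cfClassId>share-of-russell-group</cfClassId>
  <cfClassSchemeId>snowball-market-share-variants</cfClassSchemeId>
  <cfTerm cfLangCode="en">Share of Russell Group</cfTerm>
</cfClass>
<!-- Further variants (Share of England, Share of Pure user group, and so on)
     become additional cfClass entries without any model change. -->
```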

Market Share

– cf Indicator: Market Share description, plus a description per variant (e.g. Share of UK description)
– cf Measurement Class: Percentage of total research income (cf Integer)
– cf Orgunit: Institution; Discipline; Funder type
– Also recorded: Measurement period; Service; Date measurement was created; Date of data currency

Scholarly Output

– cf Indicator: Scholarly Output description
– cf Measurement Class: Number of outputs (cf Count); Number of outputs per FTE (cf Count)
– cf Orgunit: Institution; Discipline
– Also recorded: Measurement period; Service; Date measurement was created; Date of data currency

Citation Count

– cf Indicator: Citation Count description
– cf Measurement Class: Number of citations (cf Count); Number of citations per FTE (cf Float); Number of citations per output (cf Count)
– cf Orgunit: Institution; Discipline
– Also recorded: Measurement period; Service; Date measurement was created; Date of data currency

h-index

– cf Indicator: h-index description
– cf Measurement Class: h-index value (cf Integer)
– cf Orgunit: Discipline
– Also recorded: Measurement period; Service; Date measurement was created; Date of data currency

Field-Weighted Citation Impact

– cf Indicator: FWCI description
– cf Measurement Class: FWCI value (cf Integer)
– cf Orgunit: Institution; Discipline
– Also recorded: Measurement period; Service; Date measurement was created; Date of data currency

Outputs in Top Percentiles scheme

Snowball Metrics → Output Metrics → Outputs in Top Percentiles, with an open-ended set of variants:
– Outputs in Top 1% world
– Outputs in Top 5% world
– Outputs in Top 10% world
– Outputs in Top 25% world
– Outputs in Top 1% China
– Etc.

Outputs in Top Percentiles: Outputs in Top 1%

– cf Indicator: Outputs in Top 1% description
– cf Measurement Class: Number of Outputs (cf Count); Percentage of Outputs in that denominator (cf Integer); Number of Outputs per FTE (cf Count)
– cf Orgunit: Institution; Discipline
– Also recorded: Measurement period; Service; Date measurement was created; Date of data currency

Collaboration scheme

Snowball Metrics → Output Metrics → Collaboration, with 2 options:
– International
– National

Collaboration: national / international

– cf Indicator: Collaboration national description; Collaboration international description
– cf Measurement Class: Number of Outputs (cf Count); Percentage of Outputs in that denominator (cf Integer); Number of Outputs per FTE (cf Count)
– cf Orgunit: Institution; Discipline
– Also recorded: Measurement period; Service; Date measurement was created; Date of data currency

Next steps

– Indicators Task Group validates these "pictures"
– Express Snowball Metrics in:
  – formal CERIF
  – an XML example
– Publish in a new version of the recipe book
– Consider enhancing the CERIF model with (sketched below):
  – currencies of the data source
  – standard time periods
  – national currencies (GBP / Yen / USD / etc.)
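
If the enhancements above were adopted, a transferred measurement might additionally carry attributes along these lines. This is purely a sketch of the proposal; none of these names should be read as existing CERIF attributes.

```xml
<!-- Sketch of the proposed enhancements; all names are hypothetical. -->
<cfMeasurement>
  <cfMeasId>meas-awdvol-2013-0007</cfMeasId>
  <cfIndicatorId>snowball-awards-volume</cfIndicatorId>
  <!-- national currency of the value -->
  <cfValFloat cfCurrencyCode="GBP">1250000.00</cfValFloat>
  <!-- standard time period rather than free start/end dates -->
  <cfPeriod>uk-academic-year-2012-13</cfPeriod>
  <!-- currency (freshness) of the data source -->
  <cfDataCurrencyDate>2013-06-01</cfDataCurrencyDate>
</cfMeasurement>
```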