Snowball Metrics
Bibliometrics meeting, Open University, 5 March 2013
Dr Lisa Colledge, Snowball Metrics Program Director
l.colledge@elsevier.com
Snowball Metrics are…
- Endorsed by a group of distinguished UK universities to support their strategic decision making
- Tried-and-tested methodologies that are available free of charge to the higher education sector
- Defined unambiguously, enabling apples-to-apples comparisons so that universities can benchmark against their peers to judge the excellence of their performance

Snowball Metrics are unique because:
- Universities drive them bottom-up
- They are an academia–industry collaboration
Snowball Metrics address shared needs

Background: Imperial College London and Elsevier conducted a joint study of English research information management, funded by JISC. It found:
- Growing recognition of the value of data and metrics to inform and monitor research strategies
- Dissatisfaction with available tools: bespoke implementations and incompatible systems
- Frustration over the lack of a manageable set of standard metrics for sensible measurement
- Frequent, similar data requests from external bodies looking at aspects of performance that are not necessarily of most interest to universities themselves

Recommendations from the study:
- An agreed national framework for data and metrics standards is needed, and suppliers should participate in the development of these standards
- Universities need to benchmark to know their position relative to their peers, so they can strategically align resources to their strengths and weaknesses
- Universities and funders should work more collaboratively, and develop stronger relationships with suppliers
The REF alone is not a suitable tool for a university

Current situation:
- REF/RAE provides a snapshot every 5-6 years
- Focused approach to measuring outputs and impacts
- Strategic allocation of researchers
- Changing methodologies

Desired situation:
- Current snapshots, at least every year
- A broad range of measures across research and enterprise
- Comparable allocation of researchers between universities
- A stable approach
Desired situation = the vision for Snowball Metrics

Snowball Metrics drive quality and efficiency across higher education's research and enterprise activities, regardless of system and supplier, since they:
- Are the preferred standards used by research-intensive universities to view their own performance within a global context
- Encompass the scope of the key research and enterprise activities of a research-intensive university

Snowball Metrics Project Partners
Main roles and responsibilities

Everyone is responsible for covering their own costs.

University project partners:
- Agree the metrics to be endorsed by Snowball
- Determine feasible methodologies to generate the metrics in a commonly understood manner

Elsevier:
- Ensure that the methodologies are feasible, prior to publication of the recipes, by building and hosting the Snowball Metrics Lab as a test environment
- Distribute the recipes using our communications networks
- Day-to-day project management of the global program

Outside the remit of the Snowball Metrics program:
- The nature and quality of the data sources used to generate Snowball Metrics
- The provision of tools to enable the global sector to generate and use Snowball Metrics
Snowball Metrics Recipe Book

"Agreed and tested methodologies for new Snowball Metrics, and versions of existing Snowball Metrics, are and will continue to be shared free of charge. None of the project partners will at any stage apply any charges for the methodologies. Any organisation can use these methodologies for their own purposes, public service or commercial." (Extracts from the Statement of Intent, October 2012)

Elsevier's approach: any organisation can use the recipes to prepare the metrics in their own kitchen, from their own ingredients, free of charge. If an organisation approaches Elsevier for help to implement and use the metrics, we will charge "to eat at our restaurant".
The Lab tests metrics feasibility
Metrics can be size-normalised
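Size-normalisation means dividing an absolute count by a size denominator so that institutions of different scales can be compared fairly. The sketch below is a minimal, hypothetical illustration of the idea (the institution names, figures, and the choice of publications per FTE researcher are all invented for the example and are not taken from the Snowball Metrics recipes):

```python
# Hypothetical illustration of size-normalisation: an absolute metric
# (publications) is divided by a size denominator (FTE researchers).
# All figures are invented for the example.
universities = {
    "University A": {"publications": 12_000, "fte_researchers": 3_000},
    "University B": {"publications": 4_500, "fte_researchers": 900},
}

def publications_per_fte(data: dict) -> dict:
    """Return publications per FTE researcher for each institution."""
    return {
        name: round(d["publications"] / d["fte_researchers"], 2)
        for name, d in data.items()
    }

print(publications_per_fte(universities))
```

On raw counts University A looks far stronger; once size is accounted for, the smaller University B produces more publications per researcher, which is exactly the kind of apples-to-apples comparison normalisation is meant to enable.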
Metrics can be sliced and diced
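"Slicing and dicing" means aggregating the same underlying metric along different dimensions (institutional unit, funder type, subject area, and so on). A minimal sketch of the idea, using invented award records and an invented `slice_by` helper (the dimension names echo the Snowball Metrics landscape, but the data and code are purely illustrative):

```python
# Hypothetical sketch of "slicing and dicing": the same award-volume
# figures are re-aggregated along a chosen dimension, here funder type.
# All records are invented for the example.
from collections import defaultdict

awards = [
    {"funder_type": "Research council", "value": 1_200_000},
    {"funder_type": "Industry",         "value": 300_000},
    {"funder_type": "Research council", "value": 800_000},
]

def slice_by(records: list, dimension: str) -> dict:
    """Sum the metric value within each category of the chosen dimension."""
    totals = defaultdict(int)
    for record in records:
        totals[record[dimension]] += record["value"]
    return dict(totals)

print(slice_by(awards, "funder_type"))
```

The same function could re-slice the records by any other dimension present in the data, which is the point: one metric definition, many views.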
Viewing options… chart / table
Testing addressed feasibility issues
- Wide range of metrics: an experts group was formed to select and define phase 1 metrics that are impactful, do-able, and require data from 3 sources
- Data availability across the landscape: a data sharing agreement, and an unlocking model in the Snowball Metrics Lab
- Sensitivity of inputting data into a shared system: metrics are shared, not data
- Researcher-level data (Data Protection Act): used only where needed, and not revealed in metric granularity
- Manual labour in data collection: university, proprietary and third-party data are used in as close to native format as possible
Snowball Metrics are feasible

Feasibility means that they are S(S)MART:
- Specific: not open to interpretation
- Scalable: can be generated across a whole university
- Manageable: data can be collected in an acceptable amount of time
- Agreed: project partners have agreed both metric and methodology
- Realistic: can be generated by multiple universities despite distinct systems
- Time-bound: can be updated regularly to ensure information currency
Metrics for 2013

The aim is to publish Recipe Book v2 by end 2013. It is anticipated that this will add to v1 by including:
- New group 2 recipes covering additional areas of the Snowball Metrics landscape…
Snowball Metrics landscape

Numerators:
- Research Inputs: research applications; research awards; research income
- Research Processes: contract turnaround times
- Research Outcomes: publications & citations; collaboration (co-authorship); impact / esteem
- Post-Graduate Education: post-graduate research; post-graduate experience; completion rates
- Enterprise Activities: industrial income and engagement; industry research income; patenting; licensing income; spin-out generation / income

Denominators (slice and dice, and normalise for size):
- People: researchers; role
- Organisations: institution; institutional unit; external groupings
- Themes / Schemes: funder type; award type; subject area / keywords
Metrics for 2013

The aim is to publish Recipe Book v2 by end 2013. It is anticipated that this will add to v1 by including:
- New group 2 recipes covering additional areas of the Snowball Metrics landscape…
- Adoption of existing standards: translation of group 1 metrics into CERIF (Common European Research Information Format), a common language produced by euroCRIS that supports data sharing between different tools
- Enriched group 1 recipes
- Metric update and data governance approaches
- National (non-UK) versions
Global vs national standards for benchmarking

Snowball Metrics start life with a national perspective, currently the UK. The aim is to promote all aspects of Snowball Metrics, as far as possible, to a global standard.

(Illustrative only; testing underway.) Picture three overlapping sets: UK metrics, "A.N.Other" metrics, and "Elsewhere" metrics.
- A common core where benchmarking against global peers can be conducted
- Shared features where benchmarking between Elsewhere and A.N.Other, but not the UK, can be conducted
- A national peculiarity can support benchmarking within Elsewhere, but not globally
THANK YOU FOR YOUR ATTENTION!

Contact Dr Lisa Colledge: l.colledge@elsevier.com or snowballmetrics@elsevier.com