1 Usage Data: Practical evaluation. Katarina Standár, LM Information Delivery, Boatshow, May 2013

2 FACT 1. Challenges for libraries increase every year
- Fewer resources in general (at least not more)
- Budget cuts -> pressure to show value for money
- Retired staff positions are not filled
- More workload for everybody
- Constant reorganization to find more efficient ways of working
- More and more electronic media taking over from print -> a challenge for librarians to show funders what actually costs money in the library
- More complicated system infrastructure -> more difficult to know the content and data in the different systems
- Knowledge bases, A-Z lists, databases etc. -> more specialised staff needed to work in the new environments

3 Reality for library users: a diagram of users reaching full-text articles directly from their computer, iPhone and iPad via the web; where does the library fit in?

4 Reality for librarians

5 FACT 2. More and more numbers
Management wants more and more numbers and quantity metrics from the library as proof of the use of costly e-resources. They want decision making based on use, but it is very difficult to compare numbers between systems or across different types of e-material, for example e-books.

6 FACT 3. The transition from P to E still goes on
- Not all libraries have been able to shift staff focus from print to electronic
- Where the focus has shifted, there is the problem of training staff, who are not always up to the new work tasks or who find them very difficult
- Large libraries have to live in two worlds at the same time, the printed and the electronic, each with different needs, and that costs!
- Electronic books need to be handled in as much detail as printed books, but are they?
- Electronic material is much more volatile and unstable than print

7 What can libraries do?
1. Collect data sources: a package, an aggregated database, an A-Z list
2. Decide on a statistics/analytics tool, or several models
3. Analyze and interpret the data and compare it with other data

8 Evaluation
- Libraries can create an evaluation strategy
- Libraries need to evaluate databases, journals and e-books regularly, perhaps with different methods for each type, or a combination
- There are many different evaluation models and tools

9 Tools and models
- COUNTER (Counting Online Usage of Networked Electronic Resources): measures downloads of articles, books and chapters (HELP!!)
- SUSHI (Standardized Usage Statistics Harvesting Initiative)
- TRANSFER code of practice
- KBART: a standardized format for exchanging and updating product availability details across the supply chain
(A minimal report-parsing sketch follows below.)
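Once COUNTER reports have been harvested (manually or via SUSHI), the raw numbers still have to be summarised before they can be compared. The sketch below is one minimal, hypothetical way to do that in Python: it assumes a JR1-style report has been exported to CSV with "Journal" and "Reporting Period Total" columns, which is an assumption to adjust for your own exports.

```python
import csv
from collections import defaultdict

def total_downloads(report_csv_path):
    """Sum full-text downloads per journal from a COUNTER-style report
    exported to CSV. Assumes a header row with 'Journal' and
    'Reporting Period Total' columns; real JR1 exports often carry a
    few preamble rows above the header, so strip those first."""
    totals = defaultdict(int)
    with open(report_csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            title = (row.get("Journal") or "").strip()
            count = (row.get("Reporting Period Total") or "0").strip()
            if title and count.isdigit():
                totals[title] += int(count)
    return dict(totals)

if __name__ == "__main__":
    ranked = sorted(total_downloads("jr1_2012.csv").items(),
                    key=lambda kv: kv[1], reverse=True)
    for title, count in ranked[:10]:   # ten most-used journals
        print(f"{count:>8}  {title}")
```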

10 Bibliometric analysis tools: mathematics for librarians
- ISI Impact Factor: measures citations as a proxy for content relevance
- Eigenfactor score: maps the structure of academic research within citation networks
- PageRank algorithm: a numerical weighting for each element of a hyperlinked set of documents
- Journal Usage Factor (JUF): will provide information about the average use of the items in an online journal
- Altmetrics: new and fast ways of measuring research outside the scope of traditional filters, based on social media
(A worked impact-factor example follows below.)
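As a concrete illustration of the simplest of these metrics, the classic two-year impact factor is just the citations received this year to items published in the previous two years, divided by the number of citable items published in those years. The figures in the sketch below are made up purely for illustration.

```python
def two_year_impact_factor(citations_this_year, citable_items_prev_two_years):
    """Classic two-year journal impact factor: citations received this
    year to items published in the previous two years, divided by the
    number of citable items published in those two years."""
    return citations_this_year / citable_items_prev_two_years

# Made-up example: 420 citations in 2012 to articles published in
# 2010-2011, which together contained 150 citable items -> 2.8
print(round(two_year_impact_factor(420, 150), 2))
```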

11 How?
- Reviewing journal packages and products: an ongoing evaluation process over the whole year, and extremely time consuming for the library
- Using statistics tools on the market, or models for electronic resources
- Asking your users what they think about the e-resources

12 TERMS – Techniques for Electronic Resource Management

13 Projects done
”Analysstöd för statistik över användandet av elektroniska resurser” (analysis support for statistics on the use of electronic resources), a Swedish project funded by the Royal Library (Kungliga biblioteket) in 2012. An analysis tool was created, along with KPI indicators and templates for e-books, e-journals and databases, to help find indicators that evaluate e-resources more broadly than just looking at cost per use (a simple cost-per-use sketch follows below).
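Cost per use itself is easy to compute once a resource's annual cost and its recorded downloads are in hand, which is exactly why it tends to dominate. The hypothetical sketch below shows the calculation and why it needs other indicators around it.

```python
def cost_per_use(annual_cost, total_downloads):
    """Cost per use = annual subscription cost / recorded full-text downloads.
    A crude KPI on its own: it says nothing about who used the resource
    or how critical it was, so read it alongside other indicators."""
    if total_downloads == 0:
        return float("inf")  # paid for but never downloaded
    return annual_cost / total_downloads

# Hypothetical figures: a 12,000 EUR package with 4,800 downloads
# works out at 2.50 EUR per download.
print(cost_per_use(12_000, 4_800))
```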

14 Projects done
“Measures of health sciences journal use: a comparison of vendor, link resolver and local citation statistics”, Journal of the Medical Library Association, April 2013. The study was done at the University of Illinois at Chicago (UIC), which uses Serials Solutions 360 Link, an A-Z list and MARC records.

15 Challenges gathering data
- COUNTER reports: difficult to collect, not enough standards, especially for e-books
- Impact factor is highly discipline-dependent, and many journals that are used are not cited at all!
- Archives are often bought separately from packages and single subscriptions
- More and more hybrid material
- Numbers: how do you evaluate resources that have low use but are critical to just a few researchers?

16 Challenges gathering data
- A library may have access to many different platforms for a single journal
- Problems with the library's vendor accounts on the platform
- Problems with the knowledge base setup
- Many different systems in the library with the same titles but structured in different ways
- Subscriptions set up under multiple accounts can result in multiple reports from the same platform (see the merging sketch below)
- Vendor packages change every year: cancelled titles, changed titles, new titles, transferred titles
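When the same platform delivers several reports because subscriptions sit under multiple accounts, the usage has to be pulled back together before any comparison makes sense. The sketch below is a hypothetical approach: it sums totals keyed on title and online ISSN across several CSV exports, which is correct when each account reports disjoint titles; if the same usage appears in more than one report you must deduplicate instead of summing. The column names are assumptions to adapt to your own exports.

```python
import csv
from collections import defaultdict

def merge_reports(paths):
    """Combine usage rows harvested from several vendor accounts on the
    same platform, keyed on (journal title, online ISSN)."""
    merged = defaultdict(int)
    for path in paths:
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                key = ((row.get("Journal") or "").strip(),
                       (row.get("Online ISSN") or "").strip())
                total = (row.get("Reporting Period Total") or "0").strip()
                if key[0] and total.isdigit():
                    merged[key] += int(total)  # assumes disjoint reports
    return merged

# Usage: merge_reports(["account_a.csv", "account_b.csv"])
```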

17 More challenges...
- Human errors
- Not all e-resources deliver any statistics at all
- Extremely TIME CONSUMING

18 What can vendors do?
- Deliver standardized data to the systems
- Listen to libraries when developing statistics tools
- Try to understand better what libraries really need the data for

19 My experience and thoughts
1. Why is the use of e-resources discussed so much more than the use of print material ever was? Who questions an expensive reference book on the shelf and its cost per use? Why not?
2. System: up-to-date information = higher usage!
3. E-resources: accessibility = higher usage!

22 My experience and thoughts
4. E-resource evaluation should be part of the evaluation of the whole library, not a separate exercise. Why do we buy these resources and systems, and how can we prove that they help users?
5. Today it is a difficult landscape in which to find accurate data for evaluation! Who is doing it?

23 Thoughts or experiences? How do you work with evaluation in your library? How would you like vendors to help? Thank you!

