Guide to Usage Statistics


1 Guide to Usage Statistics
for Discovery Services, Databases, eJournals and eBooks.
The aim of this guide is to help you understand the terms used in usage statistics, the reports available, and how to use them to determine the cost effectiveness of your eResources.

2 Usage Statistics Key Reports
COUNTER 4 reports (see p13 of COUNTER Online Metrics):
Database Report 1: Total Searches, Result Clicks and Record Views by Month and Database. These can occur within the same session.
Database Report 2: Access Denied by Month, Database and Category.
Platform Report 1: Total Searches, Result Clicks and Record Views by Month and Platform.
Journal Report 2: Access Denied to Full-Text Articles by Month, Journal and Category.
Book Report 1: Number of Successful Title Requests by Month and Title.
Book Report 2: Number of Successful Section Requests by Month and Title.
Book Report 3: Access Denied to Content Items by Month, Title and Category.
COUNTER 5 reports (see also COUNTER 5 for Librarians):
PR_P1 Platform Usage: total and unique item requests, as well as platform searches.
DR_D1 Database Search and Item Usage: total item investigations and requests, as well as searches.
DR_D2 Database Access Denied: where users were denied access to databases.
TR_B1 Book Requests / TR_J1 Journal Requests: full-text activity for non-Gold Open Access eBooks/eJournals.
TR_B2 Book Access Denied / TR_J2 Journal Access Denied: where users were denied access to eBooks/eJournals.
TR_B3 Book Usage by Access Type / TR_J3 Journal Usage by Access Type: all applicable metrics by access type for eBooks or eJournals.
COUNTER reports are standard: use them for reliable comparisons between electronic resources from different suppliers where applicable (a short sketch of totalling a COUNTER export follows this slide).
Supplier reports (for help see EBSCO, ProQuest, BMJ Journals, Ovid): as well as the COUNTER reports, suppliers may offer reports which allow for customization or give more detail. These may be useful for more detailed analysis of electronic resources from the same supplier. Some useful examples are:
EBSCO: Database usage: sessions, searches and requests logged for each database. Interface usage: the interfaces, such as Discovery or EBSCOhost, from which a search of databases was conducted. Link activity: link-outs and link-ins for each database. Title usage: requests for content for each title in the collections purchased with the database.
ProQuest: Database Activity (summary/detail): the summary includes searches and document usage broken down by location and database, and the number of articles retrieved from each search, broken down by the format provided (citation/abstract or any full-text format); the detail version adds a breakdown by login. Searches (by time/by mode): the total number of searches by hour of the day for the last 14 days, or by search mode.
Ovid: Top articles: the articles that the customer has viewed most.
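The COUNTER 5 title reports above are normally supplied as tabular (CSV/TSV) files, so simple totals can be pulled out with a short script. Below is a minimal sketch, assuming a TR_J1 report saved as CSV with its report-header rows removed so the first row holds the column headings; the file name tr_j1.csv and the exact column names are assumptions to check against your own supplier's export.

```python
# Minimal sketch (not an official COUNTER tool): total full-text use per
# journal from a TR_J1 report saved as CSV. Assumes the COUNTER header block
# has been deleted and that the columns "Title", "Metric_Type" and
# "Reporting_Period_Total" match your supplier's export.
import csv
from collections import defaultdict

def total_requests_per_title(path):
    totals = defaultdict(int)
    with open(path, newline="", encoding="utf-8-sig") as f:
        for row in csv.DictReader(f):
            if row.get("Metric_Type") == "Total_Item_Requests":
                totals[row["Title"]] += int(row.get("Reporting_Period_Total") or 0)
    return totals

if __name__ == "__main__":
    for title, n in sorted(total_requests_per_title("tr_j1.csv").items(),
                           key=lambda item: item[1], reverse=True):
        print(f"{n:6d}  {title}")
```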

3 Usage Statistics Recommendations
General:
Where applicable, use COUNTER reports: they are standard and therefore allow more reliable comparison of electronic resources from different suppliers. COUNTER 5 reports give more detail, so use them if available. Use supplier usage reports for more granular information. Do not use OpenAthens usage data, as it only counts the number of sessions per platform.
Cost analysis: when working out the cost/benefit of a database, divide the cost of the database by your chosen measure and compare it to sourcing the same articles via ILL. Use the NHS Process Costing Framework to calculate costs accurately. See the glossary (slide 4) and the usage metric template (slide 5).
Discovery:
COUNTER reports are not particularly useful here; instead, use the discovery service (DS) standard usage reports or analytics. Use Searches Regular in preference to Searches Federated & Automated, as the latter includes searches external to the DS. Add full-text requests and link-out requests together and combine them with the number of regular searches (see the sketch after this slide). Full-text requests are those for the supplier's own content; link-out requests are for content supplied from another source. Use COUNTER Database Report 2/DR_D2 to check whether your users are hitting licence limits or accessing non-purchased databases or titles.
eJournals:
Using Journal Report 1/TR_J1, check total full-text downloads. Remember to check all access routes, for example direct access, via Discovery, or via an eJournal collection. Use Journal Report 2/TR_J2 to check whether your users are hitting licence limits or accessing non-purchased content.
Databases (if not accessing them via HDAS):
Check usage from all direct access routes to databases, i.e. native interfaces and discovery services. No statistics are available from HDAS. Session statistics do not tell you very much; use the total of regular searches and total requests from all sources to determine activity. Numbers of searches may be just as useful as full-text requests: a user may find useful information without necessarily downloading anything, and a lack of papers or results may be just as valuable. Using Database Report 1/DR_D1, add the total of Regular Searches to Result Clicks. Use Database Report 2/DR_D2 to check whether your users are hitting licence limits or accessing non-purchased databases.
eBooks:
Using Book Report 1/TR_B1, check total title requests. Use Book Report 3/TR_B2 to check whether your users are hitting licence limits or accessing non-purchased content. Suppliers' own usage reports will give more granular information, such as chapter downloads and abstract views, where the COUNTER report only gives a total; you may also find abstract requests a useful measure.
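The "add the counts together" recommendations above are simple sums. The sketch below is illustrative only: the figures are placeholders, and the variable names are descriptive rather than COUNTER field names; substitute the monthly totals from your own reports.

```python
# Placeholder monthly figures - substitute the totals from your own COUNTER
# and discovery-service reports.
regular_searches = 1240     # Searches Regular (Database Report 1 / DR_D1)
result_clicks = 310         # Result Clicks (Database Report 1 / DR_D1)
full_text_requests = 505    # full-text requests (discovery service report)
link_out_requests = 92      # link-out requests (discovery service report)

# Discovery: combine regular searches with all requests, whether satisfied
# from the supplier's own content or linked out to another source.
discovery_activity = regular_searches + full_text_requests + link_out_requests

# Databases: regular searches plus result clicks as a simple activity measure.
database_activity = regular_searches + result_clicks

print(f"Discovery activity: {discovery_activity}")
print(f"Database activity:  {database_activity}")
```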

4 Usage Statistics Glossary
Abstract Requests: the number of times an abstract is requested. In EBSCO, if a user hovers over the Abstract Preview feature, it will count as an abstract request.
Full-Text Requests: the number of times the full text is requested.
Limit Exceeded: an access denied statistic, counted when a user is unable to access a unique content item because their institution's cap on the number of simultaneous users has been exceeded.
Link-Out Requests: the number of times a link takes the user to a site external to the current site. For example, if you are using EDS and the full text is available from CINAHL, it would not count as a link-out, whereas a link to the ScienceDirect site would.
No License: an access denied statistic, counted when a user is unable to access a unique content item because their institution does not have a licence to the content.
Platform Search: a single search across three databases adds 1 to the platform search count and 1 to each of the three databases used.
Record Views: counts views of the detailed metadata or full database record. This view can come from a set of search results, from browsing the database, or from a click on another database record.
Regular Searches: counts the number of searches of the database.
Result Clicks: the number of interactions performed by the user when viewing search results for that database; may include viewing the more detailed record, viewing the full text, or clicking on a link resolver.
Searches Automated: the number of times a user searches a database when it has not been actively selected.
Searches Federated: the number of times a search is run remotely by a computer.
Searches Platform: the number of times a user runs a search on the platform, regardless of the number of databases involved in the search.
Searches Regular: the number of times a user searches a database when they have actively chosen that database from a list of options, or there is only one database available to search.
Sessions: counts the number of sessions. A session starts at login and stops at logout. One session may include multiple searches or requests.
Total Item Investigations: the total number of times a content item, or information related to a content item, was accessed.
Total Item Requests: the total number of times the full text of a content item was downloaded or viewed.
Total Requests: the total of all requests, whether full-text, abstract or link-out.
Total Searches: all searches, whether regular or platform.
Turnaways: the number of times access to a resource is denied.
Unique Item Investigations: the number of unique content items (e.g. chapters) investigated by a user.
Unique Item Requests: the number of unique content items (e.g. chapters) requested by a user (see the worked example after this glossary).
Unique Title Investigations: the number of unique titles (e.g. books) investigated by a user.
Unique Title Requests: the number of unique titles (e.g. books) requested by a user.
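The Total vs Unique distinction above is easiest to see with a small worked example. The sketch below is hypothetical: one user, in a single session, downloads the same chapter twice and a second chapter once; the item labels are made up for illustration.

```python
# Hypothetical session: chapter A of Book X downloaded twice, chapter B once.
downloads = ["Book X / Chapter A", "Book X / Chapter A", "Book X / Chapter B"]

total_item_requests = len(downloads)        # 3: every download counts
unique_item_requests = len(set(downloads))  # 2: chapter A only counted once
unique_title_requests = len({d.split(" / ")[0] for d in downloads})  # 1: one book

print(total_item_requests, unique_item_requests, unique_title_requests)
```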

5 Sample metric
A blank template and other examples may be found at kfh.libraryservices.nhs.uk/metrics.
Metric definition: Cost/usage of a full-text collection.
Why is it important? Evidence-based service development | Support stakeholder targets | Determine the cost effectiveness of a resource.
Process for compiling the metric: Cost of the full-text database / Total downloads = Unit Cost. | Total downloads x average ILL cost = ILL Cost. Make the measure and time period clear, and ensure you are using the same measure and time period if you are comparing multiple resources. The average ILL cost is obtained using process costing (see steps 1-5 below; a worked sketch of the calculation follows this slide): 1. Establish the number of ILLs per source. | 2. Establish the time spent processing each ILL in staff time, by band. | 3. Use the Hourly Rate Calculator to establish the cost of staff at each band involved in the process. | 4. Multiply the average time spent on all ILLs by the average cost of staff time. | 5. If the cost of obtaining items from the resource is higher than obtaining them via ILL, consider the benefit of the resource.
What does it mean? How do you interpret this metric? The metric indicates whether the cost of a resource is reasonable given the benefit being derived from it by LKS users. Benefit is defined as the usage being made of a resource. If the Unit Cost is considerably more than the average ILL cost per item, then you may wish to consider promotion, renegotiation of the price, or whether it is worth renewing. You may also wish to include intangible considerations, such as immediate vs delayed access, in your decision.
Desired outcomes: What would improvement look like? Do you have a level you are required to reach, or are aiming for, in a period? You might consider what would be Red, Amber and Green scores for a dashboard. The unit cost should be less than the average ILL cost per item, or should show a falling trend. If the unit cost approaches the ILL cost per item, promote the resource, or investigate users' experiences of using it.
Improvement plans: How do you plan to make a difference to this metric in a defined period? Review following feedback. Think about what your actions will be, and make plans to track the change in the metric to measure their effectiveness.
Reporting: Where and how do you plan to share the metric? Is it part of a dashboard or a regular service monitoring report? You could embed a sample graph. The metric could be reported in the LKS annual report, a business case, or in supplier negotiations.
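A minimal sketch of the Unit Cost / ILL Cost comparison described above, assuming you have already established an average staff time per ILL and an hourly staff rate using the Hourly Rate Calculator. All figures are placeholders, not recommended values.

```python
# Placeholder figures only - substitute your own costs and COUNTER totals.
resource_cost = 6000.00      # annual cost of the full-text collection (GBP)
total_downloads = 2400       # total full-text requests over the same period

# Process costing for ILL (steps 1-4 above): average staff time per request,
# costed at the hourly rate from the Hourly Rate Calculator.
minutes_per_ill = 20
hourly_rate = 18.50          # weighted hourly cost of the staff involved (GBP)
average_ill_cost = (minutes_per_ill / 60) * hourly_rate

unit_cost = resource_cost / total_downloads      # Unit Cost per download
ill_cost = total_downloads * average_ill_cost    # ILL Cost for the same items

print(f"Unit cost per download: {unit_cost:.2f}")
print(f"Average ILL cost per item: {average_ill_cost:.2f}")
print(f"ILL cost for the same usage: {ill_cost:.2f}")

# Step 5: if the resource costs more per item than ILL supply, consider
# promotion, renegotiating the price, or whether to renew.
if unit_cost > average_ill_cost:
    print("Resource costs more per item than ILL supply.")
```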

