The JISC Usage Statistics Portal Ross MacIntyre, Mimas The University of Manchester.


1 The JISC Usage Statistics Portal Ross MacIntyre, Mimas The University of Manchester

2 Personal timeline
1996-98 SuperJournal
1998 Nesli
2000 UKSG w/shop, ICOLC, ARL, PALS, …
2001 ‘US Task Force’
2002 COUNTER*
2003 *J&Db[R1]
2004 Evidence Base report: ‘Nesli2 Analysis of Usage Stats’
2005 *J&Db[R2]
2006 *eB[R1], Key Perspectives report: Usage Stats Svc Feasibility Study, Mesur
2007 Content Complete report: JUSP Scoping Study, KE event – article level stats & COUNTER
2008 JISC ITT for JUSP Scoping Study 2 (prototype), *J&Db[R3], PIRUS
2009 JUSP Report, PIRUS2
2010 (April) JISC fund JUSP to service
Areas for Discussion
Different perspectives: Publishers, Aggregators, Learning Institutions, Commercial Organisations, Product Vendors...
What do you want to monitor & why?
What is “usage”? ‘Are you getting enough?’
What are you supplying/gathering? What do you do with it?
What is your ‘holy grail’?

3 Context: to assist and support libraries in the analysis of NESLi2 usage statistics and the management of their e-journal collections.
20 NESLi2 e-journal deals
132 HEIs taking up NESLi2 deals

4

5

6 JISC Collections
USAGE STATISTICS PORTAL SCOPING STUDY: PHASE II
TECHNICAL DESIGN AND PROTOTYPING
INVITATION TO TENDER
Summary
1. This invitation to tender invites bidders to submit proposals to undertake the technical design and prototyping for a Usage Statistics Portal.
2. The deadline for proposals is 12:00 noon on Monday 14 July 2008. The work should start no later than the end of July 2008. The work, comprising a detailed technical specification and design for the Usage Statistics Portal, a scoping of the costs required to bring it to production, and a final report, should be complete by 1st March 2009.

7

8 3 Project Partners
Evidence Base: Pete Dalton & Angela Conyers
Paul Needham
Ross MacIntyre & Sean Dunne

9 Aims of this JUSP study
Refine user requirements for usage portal
Develop prototype portal
Obtain feedback from participating libraries on the portal developed
Specify what would be required to scale the portal into a full service
To be based on COUNTER usage reports, principally JR1 (total number of full-text article requests) and JR1A (requests from archives or backfiles):
JR1 = Number of Successful Full-Text Article Requests by Month and Journal
JR1a = Number of Successful Full-Text Article Requests by Month and Journal for a Journal Archive
JR2 = Turnaways by Month and Journal
JR3 = Number of Successful Item Requests and Turnaways by Month, Journal and Page Type
JR4 = Total Searches Run by Month and Service
JR5 = Number of Successful Full-Text Article Requests by Year of Publication and Journal
DB1 = Total Searches and Sessions by Month and Database
DB2 = Turnaways by Month and Database
DB3 = Total Searches and Sessions by Month and Service

10 JUSP prototype
5 HE libraries:
University of Birmingham (A) Sarah Pearson
Cranfield University (E) Simon Bevan
University of Glasgow (A) Tony Kidd
University of Liverpool (B) Terry Bucknell
University of Westminster (C) Pat Barclay
3 NESLi2 publishers: Elsevier, OUP & Springer
1 Intermediary: Publishing Technology (Ingenta)

11 User Requirements
Initial (10) requirements turned into specifications
2007 & 2008 JR1 & JR1A stats for the 3 publishers
Data obtained from the 5 libraries (& 1 publisher)
Libraries also provided relevant aggregator/gateway stats
All data parsed and loaded
Authenticated (username/password) access provided to each library for its own stats
Full access provided to JISC (Exec, Collections & JWG)
Requirements refined and a further (10) requirements specified

12 Technical – entity relationships
[Entity-relationship diagram: Publisher, Aggregator, Institution, Journal and Users; relationships: Publisher Publishes Journal, Aggregator Supplies Access, Institution Has Agreement; attributes include ISSN and ‘hits’]

13 The database v0.4
[Schema diagram: Platforms, Journals, Institutions, Statistics, Deals and Suppliers tables. Relationships key: one-to-many and many-to-many]

14 The database v0.4
Supplier table: Supplier ID, Name, Type, ContactName, JobTitle, Address, Postcode, Phone, Fax, Email
Statistics table: Journal ID, Institution ID, Publisher ID, Supplier ID, Platform ID, Type, YYYYMM, Accesses
Platform table: Platform ID, Name
Journal table: Journal ID, Title, AltTitle, ISSN, eISSN, Subjects, Source
Institution table: Institution ID, Name, Type, LoginID, ContactName, JobTitle, Address, Postcode, Phone, Fax, Email, JISCBand
Deal_Details table: Deal ID, YYYY, Journal ID
Deals_Summary table: Deal ID, Publisher ID, Title
Relationships key: one-to-many and many-to-many
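
The slide lists only table and column names, so as a minimal sketch the core of the v0.4 schema might look like the following (SQLite via Python's stdlib; column types, keys and all data values are assumptions, not taken from the prototype):

```python
import sqlite3

# Minimal sketch of two core v0.4 tables; names follow the slide,
# types and keys are guesses for illustration only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Journal (
    JournalID INTEGER PRIMARY KEY,
    Title TEXT, AltTitle TEXT, ISSN TEXT, eISSN TEXT,
    Subjects TEXT, Source TEXT
);
CREATE TABLE Statistics (
    JournalID INTEGER REFERENCES Journal(JournalID),
    InstitutionID INTEGER, PublisherID INTEGER,
    SupplierID INTEGER, PlatformID INTEGER,
    Type TEXT,        -- e.g. 'JR1' or 'JR1A'
    YYYYMM TEXT,      -- reporting month, e.g. '200701'
    Accesses INTEGER  -- full-text requests that month
);
""")
conn.execute("INSERT INTO Journal (JournalID, Title, ISSN) "
             "VALUES (1, 'Example Journal', '1234-5678')")
conn.execute("INSERT INTO Statistics VALUES (1, 10, 3, 2, 1, 'JR1', '200701', 42)")
total, = conn.execute("SELECT SUM(Accesses) FROM Statistics "
                      "WHERE Type = 'JR1'").fetchone()
print(total)  # 42
```

One row per journal/institution/month keeps the monthly COUNTER figures directly queryable for the summary screens that follow.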

15 Technical – conversion from .xls

16 Technical – conversion to .xml
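
The slides do not show the conversion code itself, but the .xls-to-.xml step might be sketched as follows, assuming the spreadsheet rows have already been parsed (reading a real .xls file would need a spreadsheet library); the element names and data are illustrative only:

```python
import xml.etree.ElementTree as ET

# Hypothetical JR1 rows, standing in for data parsed from an .xls report.
rows = [
    {"title": "Example Journal", "issn": "1234-5678", "month": "2007-01", "requests": 42},
    {"title": "Example Journal", "issn": "1234-5678", "month": "2007-02", "requests": 17},
]

# Build an XML tree: one <Journal> element per row, usage as text content.
report = ET.Element("Report", attrib={"type": "JR1"})
for row in rows:
    journal = ET.SubElement(report, "Journal", attrib={"issn": row["issn"]})
    ET.SubElement(journal, "Title").text = row["title"]
    usage = ET.SubElement(journal, "Usage", attrib={"month": row["month"]})
    usage.text = str(row["requests"])

xml_text = ET.tostring(report, encoding="unicode")
print(xml_text)
```

A uniform XML representation lets reports from different publishers be parsed and loaded by one ingest routine regardless of the spreadsheet layout they arrived in.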

17 What does it demonstrate?
For libraries:
Single point of access to own usage statistics
Monthly figures presented in both calendar and academic years
Addition where relevant of gateway/aggregator statistics
Usage of current collections with backfiles removed
Assistance with SCONUL statistical return
Trend analysis, high-use titles, publisher summaries etc.
For JISC Collections:
Access to usage statistics for all libraries and all NESLi2 deals

18 Link to JUSP Prototype

19 1. Single point of access to all JR1 and JR1A usage statistics as currently downloaded individually from publisher websites
User informational text: From this page, you can download JR1 and JR1A (archive) reports. You can select calendar year or academic year.
Interface shows:
Publisher – drop-down list (Elsevier, Springer, OUP)
Report – drop-down list (JR1 (all), JR1A (archive only))
Year – drop-down list (2007 (Jan-Dec), 2007/08 (Aug-Jul), 2008 (Jan-Dec))
Submit request
Data processing notes for prototype: Show title-by-title JR1 and JR1A (archive) reports in a form that can be downloaded by the library.

20 2. Addition of aggregator/gateway JR1 statistics where relevant
User informational text: To get a full picture of usage you may need to add usage statistics provided by other services such as Swetswise. This will depend on the publisher. Select publisher and year to download JR1 reports with Ingenta, Swetswise, Ebsco EJS etc. included where appropriate.
Hyperlink note text: If you use Ingenta, you will always have to add their JR1 usage statistics to those from the publisher to get a full record of use. For some publishers you will also have to add JR1 usage statistics from services like Swetswise.
Interface shows:
Publisher – drop-down list (Elsevier, Springer, OUP)
Report – drop-down list (JR1 (all))
Year – drop-down list (2007 (Jan-Dec), 2007/08 (Aug-Jul), 2008 (Jan-Dec))

21 3. Excluding usage of backfile collections
User informational text: JR1 reports include all usage. Some publishers also produce JR1A reports which give only usage of their archive or backfile collections. If you have access to these, you can download here reports that exclude backfile use and show only usage of current titles.
Interface shows:
Publisher – drop-down list (Elsevier, OUP, Springer (no JR1A report))
Year – drop-down list (2007 (Jan-Dec), 2007/08 (Aug-Jul), 2008 (Jan-Dec))
Data processing notes for prototype: Titles in JR1 and JR1A matched by ISSN. JR1A usage subtracted from JR1. Use the total JR1 figures as in Screen 2.
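
The backfile-exclusion step described in the processing notes is simple arithmetic once titles are matched by ISSN; a sketch with made-up figures:

```python
# Match JR1 and JR1A totals by ISSN and subtract archive usage,
# leaving usage of current titles only. All figures are illustrative.
jr1 = {"1234-5678": 100, "2345-6789": 50}   # ISSN -> total full-text requests
jr1a = {"1234-5678": 30}                    # ISSN -> archive/backfile requests

# Titles with no JR1A entry keep their full JR1 count.
current_use = {issn: total - jr1a.get(issn, 0) for issn, total in jr1.items()}
print(current_use)  # {'1234-5678': 70, '2345-6789': 50}
```

Defaulting the JR1A count to zero handles publishers (such as Springer on this slide) that produce no JR1A report at all.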

22 4. Summary tables to show trends over time, compare publishers etc.
User informational text: Use these tables to look at usage trends over time, and to compare usage of the various publisher deals to which you subscribe.
Interface shows:
Publisher – drop-down list (all, Elsevier, Springer, OUP)
Monthly/yearly – select whether you want to look at monthly or yearly stats
Year – calendar years all, 2007, 2008; academic year 2007/08
Select a time period from & to
Data processing notes for prototype:
Monthly – use the monthly total JR1 figures as in Screen 2
Yearly – use the total JR1 figures for the year
Academic year – use the total JR1 figures for the year; figures for SCONUL return

23 5. Summary table to show use of aggregators
User informational text: Use this table to see how much of your total usage goes through aggregators, e.g. Ingenta and Swetswise.
Interface shows:
Publisher – drop-down list (Elsevier, Springer, OUP)
Year – calendar years all, 2007, 2008; academic year 2007/08
Data processing notes for prototype: Give separate columns for publisher, Ingenta, Swets and total, and show JR1 usage in each. Show percentage use from each source.

24 6. Summary table to show use of backfiles
User informational text: Use this table to see how much of your total usage comes from backfiles.
Interface shows:
Publisher – drop-down list (Elsevier, Springer, OUP)
Year – calendar years all, 2007, 2008; academic year 2007/08
Data processing notes for prototype: Use JR1 total including aggregators. Use JR1A as in first screen. Show percentage of total JR1 usage that comes from JR1A.

25 7. ‘Some more figures’ [sic]
User informational text: Find the average, median and maximum number of requests.
Interface shows:
Publisher – drop-down list (all, Elsevier, Springer, OUP)
Year – calendar years all, 2007, 2008; academic year 2007/08
Data processing notes for prototype: Use total JR1 for each publisher, i.e. including aggregators. Calculate average, median and maximum number of requests.
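
The ‘some more figures’ calculation is standard descriptive statistics over a publisher's per-title request counts; a sketch with made-up data:

```python
import statistics

# Per-title JR1 request totals for one publisher (illustrative figures).
requests_per_title = [5, 12, 0, 230, 48, 12]

mean_requests = statistics.mean(requests_per_title)
median_requests = statistics.median(requests_per_title)
max_requests = max(requests_per_title)
print(mean_requests, median_requests, max_requests)
```

The gap between the mean and the median here (one heavily used title pulls the mean up) is exactly why the slide asks for both figures.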

26 8. What titles have the highest use?
User informational text: Find the titles which have the highest use.
Interface shows:
Publisher – drop-down list (all, Elsevier, Springer, OUP)
Year – calendar years all, 2007, 2008; academic year 2007/08
Display the 20 titles with the highest usage (include publisher, title, ISSN and number of requests, in descending order of number of requests).
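
The highest-use listing is a sort-and-truncate over the per-title totals; a sketch (top 3 shown for brevity where the prototype displays 20, data invented):

```python
# (title, ISSN, JR1 requests) tuples for one publisher; illustrative data.
titles = [
    ("Journal A", "1111-1111", 40),
    ("Journal B", "2222-2222", 310),
    ("Journal C", "3333-3333", 75),
    ("Journal D", "4444-4444", 5),
]

# Sort by request count, descending, and keep the top N.
top = sorted(titles, key=lambda t: t[2], reverse=True)[:3]
print([t[0] for t in top])  # ['Journal B', 'Journal C', 'Journal A']
```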

27 9. Tables and graphs
User informational text: See your monthly or annual usage over time as a chart.
Interface shows:
Publisher – drop-down list (Elsevier, Springer, OUP)
Monthly/yearly – select whether you want to look at monthly or yearly stats
Year – calendar years all, 2007, 2008; academic year 2007/08
Or select a time period from & to

28 10. Benchmarking
User informational text: Compare your usage with others in the same JISC band.
Interface shows:
Publisher – drop-down list (all, Elsevier, Springer, OUP)
Year – calendar years all, 2007, 2008; academic year 2007/08
Or select a time period from & to
Data processing notes for prototype: Use yearly JR1 totals (including aggregator) for each publisher. Highlight in some way the requesting library. Give the total for all libraries in the JISC band and the average.
Notes: Solely for the purposes of the prototype, the libraries are grouped into two bands, A and B. These are NOT the true JISC Collections bands for these institutions.
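
The benchmarking processing notes amount to grouping yearly JR1 totals by band and reporting each band's total and average; a sketch (library names, bands and totals are all invented, echoing the slide's caveat that these are not real JISC bands):

```python
# Yearly JR1 totals per library, tagged with an illustrative band.
usage = {
    "Library 1": ("A", 1200),
    "Library 2": ("A", 800),
    "Library 3": ("B", 300),
}

# Group totals by band.
bands = {}
for library, (band, total) in usage.items():
    bands.setdefault(band, []).append(total)

# Band total and average, as the prototype's benchmarking table shows.
for band, totals in sorted(bands.items()):
    print(band, sum(totals), sum(totals) / len(totals))
```

Highlighting the requesting library and anonymising the rest would be presentation-layer choices on top of this grouping, which is where the survey results on the following slides come in.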

29 Usage Statistics Portal: Benchmarking functionality
JISC Collections Benchmarking Survey – March 2010
76 institutions responded to our short survey on the usage statistics portal (benchmarking functionality). Our findings are detailed below.
Question 1: How useful would it be for you to benchmark your institution’s journal usage for each individual NESLi2 publisher against that of other HE institutions? (76 responses)
38/76 (50%) = Very useful
36/76 (47.4%) = Somewhat useful
2/76 (2.6%) = Not useful

30 Question 5: Regarding questions 2-4 above, please indicate which would be your preferred choice regarding benchmarking (74 responses)
37/74 (50%) = Named institution
23/74 (31.1%) = Listed anonymously (same JISC band)
14/74 (18.9%) = Average usage by institutions in the same JISC band

31 Question 10: Regarding questions 7-9 above, which would be your preferred choice? (74 responses)
37/74 (50%) = Being anonymised within my JISC band
30/74 (40.5%) = Other institutions being able to see my institution's name
7/74 (9.5%) = Being part of an average figure for the band I am in

32 Question 6: Are there any other benchmarking criteria you would like to see?
Same ‘mission group’ – Russell Group / by type of institution / pre- and post-’92 / ’94 Group
Select our own particular subset of named institutions
Similar size and structure
Usage, spend and budget for resources
Cost per download & cost per FTE – student and staff, at department/subject level
SCONUL divisions (RLUK, old, new, collHE) and by area; Scotland/Wales would also be useful
Trend over a period of years

33 Question 11: Please add any additional comments you would like to make
If OK with the licence, then comparing named institutions would be best / happy to be named if all institutions are named
Averages are not helpful unless accompanied by other institutional data; anonymised usage figures would be more useful
Institutions within the same JISC band can vary widely (e.g. do they have a medical school, do they still have a chemistry dept?), so you really need the institution name to give any sort of useful benchmarking
Pulling in data like FTE and RAE would save us all from having to do that ourselves
Would be useful for NESLi2; however, the majority of our deals are outside NESLi2

34 Q11 Continued “The COUNTER code of practice release 3 provides for consortium level reports (with named institutions). These reports are available to consortium managers and it seems to be the norm in consortia elsewhere to share this information freely within the consortium. We are the consortium, remember! Why this paranoia and coyness in the UK?”

35

36 Requirements not addressed in prototype
11. Getting price information for journals. Download list price of journals as supplied on the publisher's website.
12. Adding price information to journal lists. See your annual usage with information on list price for each journal.
13. ‘What titles are in the deal?’
14. Adding deal information to journal lists.
15. Showing usage/non-usage of titles listed in the deal and titles not listed.
16. Summary table showing usage/non-usage of titles listed in the deal and not listed in the deal.
17. Summary table showing average and median use of titles listed in the deal and titles not listed.
18. Download area 1: cost per request.
19. Download area 2: usage of subscribed titles (tabular data).
20. Download area 3: charts and graphs.

37 Outstanding Issues
COUNTER R3 compliance as of Sept 2009 was low – now much higher
Upload of publisher price lists – lack of machine-readable sources (maybe ONIX Serials – SPS?)
Use a subset of a Crossref journals list to populate the Journal and Supplier tables – nope
Subject categorisation of journals – nope
Identify which nil/low-use journals are not fully available within the deal – nope

38 Where next?
30th March 2010 – JISC accepted proposal to move from prototype to production
Consortium: Evidence Base, Cranfield, Mimas and JISC Collections
Accelerated development required – fully operational by end 2011 – all libraries, all publishers & intermediaries

39 ‘To Do’ List
Production service
Scaling up: more libraries, more publishers
Implementing SUSHI
Further development of database
Further exploration of ‘added value’ services, e.g. adding price and subject information, dealing with title changes, publisher transfers etc.
Further assistance to libraries in analysing own usage
Authentication
Benchmarking
COUNTER for eBooks

40 Final Observations
Open source – available to institutions or other consortia
Complementary to, not in competition with, licensed software offerings

41 Q&A
Ross.MacIntyre@Manchester.ac.uk
* With apologies to the CBS TV show “How I Met Your Mother”

