
Upping Our Game: Leading on Transformational Analytics & Getting Off the Hits Train Unlocking New Value from Content Stephen Abram, MLS


1 Upping Our Game: Leading on Transformational Analytics & Getting Off the Hits Train Unlocking New Value from Content Stephen Abram, MLS Stephen.abram@gmail.com stephenslighthouse.com

2 Are you on the ‘HITS’ train?

3 BIG DATA

4 QUALITATIVE INFORMATION and QUANTITATIVE DATA

5 STATISTICS and MEASUREMENTS

6 What do we do when our buyers are asking for data that does not align with their goals?

7 Have Journal Prices Really Increased Much in the Digital Age? (Scholarly Kitchen blog) http://bit.ly/11b3hP2

8 Good Questions  What if prices of the predominant journal form have actually been falling?  What if we’ve been measuring the wrong things, or measuring insufficiently?  And what if the growth in expenses is not the result of price increases but a result of the growth in science?

9 The Real Digital Story  Print subscription prices are a misleading and inaccurate method for tracking library serials spending  “... libraries’ spending on periodicals has increased three-fold while their collections have tripled in size... Spending three times as much to get three times as much tells a very different story from the “price increases” story....”  Published article output and research spending have grown 3.0% to 4% per year since 1990

10

11 And what does this all mean?  We’re playing a fool’s game when we play the raw statistics game.

12

13 Are you locked into library financial mindsets?

14 What about value and impact?

15 Or shall we stick with this?

16 Grocery Stores

17

18

19 Cookbooks, Chefs...

20

21 Meals

22

23

24

25

26

27 What do we count and share?  Titles  Clicks  Downloads  Sessions  Session length  COUNTER (Counting Online Usage of Networked Electronic Resources)  SUSHI (Standardized Usage Statistics Harvesting Initiative)  etc.
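The raw counts on slide 27 (clicks, downloads, sessions, session length) can all be derived from a plain event log. Below is a minimal sketch, assuming a hypothetical CSV export with user_id, timestamp, and action columns and a 30-minute idle gap to split sessions; it is not an actual COUNTER report or SUSHI feed.

```python
# Sketch: deriving slide 27's "countable" metrics from a hypothetical event
# log (user_id, timestamp, action). Column names and session rule are assumptions.
import csv
from collections import defaultdict
from datetime import datetime, timedelta

SESSION_GAP = timedelta(minutes=30)  # assumed idle gap that closes a session

def load_events(path):
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            yield row["user_id"], datetime.fromisoformat(row["timestamp"]), row["action"]

def summarize(path):
    events_by_user = defaultdict(list)
    downloads = 0
    for user, ts, action in load_events(path):
        events_by_user[user].append(ts)
        if action == "download":
            downloads += 1

    sessions, total_length = 0, timedelta()
    for times in events_by_user.values():
        times.sort()
        start = prev = times[0]
        for ts in times[1:]:
            if ts - prev > SESSION_GAP:      # long gap ends the current session
                sessions += 1
                total_length += prev - start
                start = ts
            prev = ts
        sessions += 1
        total_length += prev - start

    return {
        "unique_users": len(events_by_user),
        "sessions": sessions,
        "downloads": downloads,
        "avg_session_minutes": (total_length / sessions).total_seconds() / 60 if sessions else 0,
    }
```

As the rest of the deck argues, numbers like these are easy to produce but say nothing about whether anyone was actually helped.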

28 Or should we measure?  Was there improved customer satisfaction?  Did learning happen?  Was there an impact on research or strategic outcomes?  Did the patient live, improve, survive, thrive?  Did we impact discovery, creation, patents...?  Do librarians or types of end users have different values and behaviours?
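The outcome questions on slide 28 need different inputs, typically a short post-session survey joined back to usage. A minimal sketch under assumed field names (satisfied, task_completed, returned_within_30_days); the survey instrument itself is hypothetical.

```python
# Sketch: outcome-oriented metrics for slide 28, from hypothetical survey
# responses. Field names are assumptions, not a real schema.
def outcome_metrics(responses):
    """responses: iterable of dicts with boolean-ish survey fields."""
    rows = list(responses)
    n = len(rows) or 1
    rate = lambda field: sum(1 for r in rows if r.get(field)) / n
    return {
        "satisfaction_rate": rate("satisfied"),          # "Was there improved satisfaction?"
        "task_success_rate": rate("task_completed"),     # "Did learning / the task happen?"
        "return_rate": rate("returned_within_30_days"),  # rough proxy for lasting impact
    }

print(outcome_metrics([
    {"satisfied": True, "task_completed": True, "returned_within_30_days": False},
    {"satisfied": True, "task_completed": False, "returned_within_30_days": True},
]))
```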

29 Algorithms  Search differentiator  Reducing ‘clicks’ & downloads is good  Commercial algorithms versus those based on big data  Measuring end user success versus known item retrieval…  “Romeo and Juliet”  Problems with the unmonitored trial  Wrong tests  Poor sampling  Mindset issues
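Slide 29’s warnings about wrong tests and poor sampling amount to this: measure end-user success on a properly drawn random sample of real searches, not on a handful of known-item queries like “Romeo and Juliet”, and report the uncertainty. A minimal sketch, assuming each sampled search has already been labelled successful or not (by user feedback or judging, which is outside the snippet).

```python
# Sketch: estimating end-user search success from a random sample of searches,
# with a rough 95% confidence interval so small samples (slide 29's "poor
# sampling") show up as wide intervals instead of being hidden.
import math
import random

def sample_success_rate(labeled_searches, sample_size=200, seed=42):
    """labeled_searches: list of (query, succeeded) pairs; labels are assumed given."""
    random.seed(seed)
    sample = random.sample(labeled_searches, min(sample_size, len(labeled_searches)))
    if not sample:
        return 0.0, (0.0, 0.0)
    n = len(sample)
    p = sum(1 for _, ok in sample if ok) / n
    margin = 1.96 * math.sqrt(p * (1 - p) / n)   # normal approximation
    return p, (max(0.0, p - margin), min(1.0, p + margin))
```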

30 Sharing Learning and Research  Satisfaction and change  Usability versus User Experience  End users versus librarians  Known item retrieval (favourite test) versus immersion research  Lists versus Discovery  Scrolling versus pagination  Devices and browsers and agnosticism  Individual research experience vs. impacts on e-courses, LibGuides, training materials, etc.

31 Gale Analytics

32 Focus on and Understand the Whole Experience

33 Inside Lego™ Pieces  ForeSee satisfaction and demographic data  COUNTER & SUSHI data  Database usage (unique user, session, length of session, hits, downloads, etc.)  Google Analytics  Search Samples  ILS Data  Geo-IP data
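The “Lego pieces” on slide 33 only become a story once they are joined on something shared, usually a session or anonymized user key. A minimal sketch, assuming each source has already been exported as a dict keyed by session_id; the sources and fields are placeholders, not Gale’s actual pipeline.

```python
# Sketch: joining slide 33's separate sources (usage, satisfaction survey,
# Geo-IP) on a shared session key. All source and field names are assumptions.
def join_pieces(usage, survey, geoip):
    """Each argument: dict mapping session_id -> dict of fields."""
    joined = {}
    for sid, usage_fields in usage.items():
        record = dict(usage_fields)            # downloads, minutes, etc.
        record.update(survey.get(sid, {}))     # ForeSee-style satisfaction score, if any
        record.update(geoip.get(sid, {}))      # region/country resolved from the session IP
        joined[sid] = record
    return joined

print(join_pieces(
    {"s1": {"downloads": 3, "minutes": 12}},
    {"s1": {"satisfaction": 8}},
    {"s1": {"region": "Ontario"}},
))
```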

34 What We Know (US/Canada)  27% of our users are under 18.  59% are female.  29% are college students.  5% are professors and 6% are teachers.  Daily, 35% of our users are there for the very first time!  Only 29% found the databases via the library website.  59% found what they were looking for on their first search.  72% trusted our content more than Google.  But, 81% still use Google.

35 Statistics, Measurements and Analytics  COUNTER & SUSHI data are very weak metrics that don’t provide insights into the critical stuff  Database usage (unique user, session, length of session, hits, downloads, etc.)  Web and Google Analytics (6,000+ websites)  ForeSee satisfaction and demographic data  Search Samples (underemphasized at this point)  Time of Year Analysis  ILS Data (from clients & partnerships)  Geo-IP data, analytics and mapping  Impact studies and sampling
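Slide 35 lists “Time of Year Analysis” among the underused pieces, and slide 49 notes that the school cycle drives usage. A minimal sketch of grouping sessions by month so that cycle becomes visible, assuming a list of ISO-8601 session timestamps as input.

```python
# Sketch: time-of-year analysis (slides 35 and 49) - monthly session counts,
# so school-cycle peaks and summer troughs show up directly.
from collections import Counter
from datetime import datetime

def sessions_by_month(timestamps):
    """timestamps: iterable of ISO-8601 strings, e.g. '2012-09-14T10:30:00'."""
    return Counter(datetime.fromisoformat(ts).strftime("%Y-%m") for ts in timestamps)

print(sessions_by_month(["2012-09-14T10:30:00", "2012-09-20T08:00:00", "2012-12-02T13:15:00"]))
```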

36

37

38 Who are our audiences?  Librarians (management, reference, acquisitions, systems, LMS, etc., in several languages)  Institutional information technology and systems professionals  eLearning professionals and developers  Web design professionals  Library Management team & Chief Librarian  City or University administration, Provosts  Or end users?

39 Analytics

40 What real information do we need?  How do library databases compare with other web experiences and expectations?  Who are our core virtual users?  What are user expectations for satisfaction?  How does library search compare to consumer search like Google?  How do people find and connect with library virtual services?  What should we ‘fix’ as a first priority?  Are end users successful from their own point of view?  Are they happy? Will they come back? Tell a friend?
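Two of slide 40’s questions, how people find the databases and how many users are first-timers, can be answered from a web-analytics session export. A minimal sketch assuming hypothetical referrer and first_visit fields; the 29% and first-time figures quoted elsewhere in the deck come from Gale’s own data, not from this code.

```python
# Sketch: answering slide 40's "how do people find us" and "who is new" from a
# hypothetical session export with 'referrer' and 'first_visit' fields.
def access_breakdown(sessions, library_domains=("library.example.edu",)):
    n = len(sessions) or 1
    via_library = sum(
        1 for s in sessions
        if any(d in (s.get("referrer") or "") for d in library_domains)
    )
    first_time = sum(1 for s in sessions if s.get("first_visit"))
    return {
        "share_via_library_site": via_library / n,
        "share_first_time": first_time / n,
    }
```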

41 Gale Library Databases Compare Very Well to Other Web Experiences

42 Library Search Needs to Improve

43 Library Users Trust Library Databases More.

44 Wow! Only 29% of Users Find E-Resources Through Library Websites.

45 And 39% of Your Users Are in Our Databases for the Very First Time!

46 Gale Analysis: Mobile March 26 – September 25, 2012

47 End Users Are Likely to Return

48 End Users Evaluate Our Services as Meeting Expectations

49 The School Cycle Drives Many Usage Scenarios

50 EDUCATE and Lead

51 Quick Poll Should our industry: 1. Invest in the development and promotion of a suite of end-user impact and value measurement tools that actually communicate the value in our products? or 2. Just deliver the raw statistics that customers are asking for and let them perform the analyses independently?

52 Until lions learn to write their own story, the story will always be from the perspective of the hunter, not the hunted.

53 Stephen Abram, MLS, FSLA Consultant, Dysart & Jones/Lighthouse Partners Cell: 416-669-4855 stephen.abram@gmail.com Stephen’s Lighthouse Blog http://stephenslighthouse.com Facebook, Pinterest, Tumblr: Stephen Abram LinkedIn / Plaxo: Stephen Abram Twitter: @sabram SlideShare: StephenAbram1

