
1 Measuring the Impact of Networked Electronic Resources: Developing an Assessment Infrastructure for Libraries, State, and Other Types of Consortia
Presented by: Terry Plum, Simmons GSLIS; Brinley Franklin, University of Connecticut; Martha Kyrillidou, ARL; Gary Roebuck, ARL; MaShana Davis, ARL
Library Assessment Conference 2008, University of Washington, Seattle, WA, August 4, 2008

2 MINES for Libraries® Association of Research Libraries Statistics and Measurement Program
MINES for Libraries® Background
 1982 First indirect cost study – in-house
 1986 Agreement with DHHS
 1998 CARL surveying Medline and FirstSearch
 1999 ARL New Measures – Terry Plum
 2003 ARL adopts MINES into StatsQUAL®
 2005 ARL interactive data with OCUL
 2007 ARL in discussion with EZproxy
 2008 IMLS grant application

3 Measuring Digital Content Use
 The most common current method of measuring library patrons' usage of electronic resources is not web-based usage surveys but vendor-supplied transaction data.
 Web-based usage surveys are increasingly relevant for collecting usage data to make collection development and service decisions, to document evidence of usage by particular patron populations, and to collect and analyze performance outputs.
Brinley Franklin and Terry Plum, "Successful Web Survey Methodologies for Measuring the Impact of Networked Electronic Services (MINES for Libraries®)," IFLA Journal 32 (1), March 2006.

4 ARL New Measures Toolkit: StatsQUAL®
LibQUAL+®: a rigorously tested web-based survey that libraries use to solicit, track, understand, and act upon users' opinions of service quality.
DigiQUAL®: an online survey for users of digital libraries that measures the reliability and trustworthiness of Web sites; an adaptation of LibQUAL+® to the digital environment.
MINES for Libraries®: Measuring the Impact of Networked Electronic Resources (MINES) is an online transaction-based survey that collects data on the purpose of use of electronic resources and the demographics of users.
ARL Statistics™: a series of annual publications that describe the collections, expenditures, staffing, and service activities of Association of Research Libraries (ARL) member libraries.
ClimateQUAL™: Organizational Climate and Diversity Assessment, an online survey that measures staff perceptions of (a) the library's commitment to the principles of diversity, (b) organizational policies and procedures, and (c) staff attitudes.

5 What is MINES?
 Action research
 Historically rooted in indirect cost studies
 Set of recommendations for research design
 Set of recommendations for web survey presentation
 Set of recommendations for information architecture in libraries
 Plan for continual assessment of networked electronic resources
 An opportunity to benchmark across libraries

6 What is MINES? (cont'd)
 Measuring the Impact of Networked Electronic Services (MINES)
 MINES is a research methodology that measures the usage of a library's or consortium's networked electronic resources by specific categories of the patron population.
 MINES is a web-based survey form of five questions administered at the time of transaction.
 MINES measures:
 User status and discipline/affiliation (who)
 Physical location (where)
 Primary purpose and reason of use (why)
 MINES has been part of the Association of Research Libraries' New Measures & Assessment Initiatives since 2003.
 MINES differs from other electronic resource usage measures that quantify total usage (e.g., Project COUNTER, E-Metrics) or that measure how well a library makes electronic resources accessible (LibQUAL+®).

7 Recent Data Collection Activities via ARL
 Ontario Council of University Libraries (OCUL)
 University of Iowa Libraries
 University of Macedonia

8 Questions Addressed
 How extensively do sponsored researchers use the new digital information environment?
 Are researchers more likely to use networked electronic resources from inside or outside the library?
 Are there differences in usage of electronic information based on the user's location (e.g., in the library; on campus but not in the library; off campus)?
 What is a statistically valid methodology for capturing electronic services usage, both in the library and remotely, through web surveys?
 Are particular network configurations more conducive to studies of digital library patron use?

9 Reliance on Vendor Statistics
Vendor statistics, while more reliable than in the past, are still maturing.

10 Library User Survey

11 Library User Survey: Patron Status

12 Library User Survey: Affiliation

13 Library User Survey: Location

14 Library User Survey: Purpose

15 Issues with Web Surveys
 Research design
 Coverage error
 Unequal access to the Internet
 Internet users are different from non-users
 Response rate
 Response representativeness
 Random sampling and inference
 Non-respondents
 Data security

16 MINES Strategy
 A representative sampling plan, including sample size, is determined at the outset. Typically there are 48 hours of surveying over 12 months at a medical library and 24 hours a year at a main library.
 Random-moment, web-based surveys are employed at each site.
 Participation is usually mandatory, negating non-respondent bias, and is based on actual use in real time.
 IRB waiver or approval is obtained.
 Libraries with database-to-web gateways or proxy rewriters offer a comprehensive networking solution for surveying all networked-services users during survey periods.
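A sampling plan like the one above can be sketched in code. The following is a minimal illustration, not the MINES implementation: it draws random two-hour survey slots totaling 24 hours across 12 months. The function name and the daytime service-hours window are assumptions.

```python
import random

def sample_survey_slots(hours_per_year=24, slot_hours=2, months=12, seed=None):
    """Draw random (month, day, start_hour) survey slots, spread evenly
    across the year, totaling hours_per_year of surveying."""
    rng = random.Random(seed)
    slots_per_month = (hours_per_year // slot_hours) // months
    plan = []
    for month in range(1, months + 1):
        for _ in range(slots_per_month):
            day = rng.randint(1, 28)         # a day valid in every month
            start_hour = rng.randint(8, 20)  # assumed daytime service hours
            plan.append((month, day, start_hour))
    return plan

plan = sample_survey_slots(seed=1)  # 12 two-hour slots, one per month
```

In practice the slots would be drawn from the library's actual service calendar; the point is that the survey moments are chosen randomly in advance, not opportunistically.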

17 MINES Strategy (cont'd)
 Placement
 Point of use, not remembered, predicted, or critical-incident recall
 Usage rather than user
 What about multiple usages?
 Time out?
 A cookie or other mechanism with auto-population, or, more recently, counting invisibly with a time-out
 Distinguish patron association with libraries
 For example, medical library vs. main library
 But if the resources are purchased campus-wide for everyone, how is patron affiliation determined?
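One way to handle multiple usages is the cookie/time-out mechanism mentioned above: survey the first use in a session and count later uses invisibly until the session times out. A hypothetical sketch; the 30-minute window and the state layout are assumptions:

```python
SESSION_TIMEOUT = 30 * 60  # assumed 30-minute inactivity window, in seconds

def handle_request(state, now):
    """Decide whether to show the survey or only count the use.
    `state` stands in for per-patron data that would live in a cookie:
    {'last_seen': <timestamp>} or empty for a first-time visitor."""
    last_seen = state.get('last_seen')
    new_session = last_seen is None or (now - last_seen) > SESSION_TIMEOUT
    state['last_seen'] = now
    return 'show_survey' if new_session else 'count_only'

state = {}
first = handle_request(state, 1000)   # new session: present the survey
second = handle_request(state, 1100)  # same session: count invisibly
```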

18 Web Survey Design Guidelines
Web survey design guidelines that MINES followed:
 Presentation
 Simple text for different browsers, no graphics (different browsers render web pages differently)
 Few questions per screen, or simply few questions
 Easy to navigate
 Short and plain
 No scrolling
 Clear and encouraging error or warning messages
 Every question answered in a similar, consistent way (radio buttons, drop-downs)
 ADA compliant
 Introduction page or paragraph, easy to read
 Respondents must see the definition of sponsored research
 Questions can be presented in response to answers; for example, if sponsored research was chosen, a follow-up survey could be presented
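The last guideline, presenting questions in response to answers, is simple branching logic. A sketch with hypothetical question and answer names, not the actual MINES instrument:

```python
def next_question(answers):
    """Return a follow-up question id based on answers so far, or None.
    Here, choosing sponsored research triggers a grant-detail follow-up
    (question and answer names are illustrative)."""
    if answers.get('purpose') == 'Sponsored (Funded) Research':
        return 'grant_details'
    return None  # no follow-up; submit the survey

follow_up = next_question({'purpose': 'Sponsored (Funded) Research'})
```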

19 Quality Checks
 Target population matches the population frame: the patrons who were supposed to be surveyed were surveyed, except in libraries with outstanding open digital collections.
 Check usage against IP. Here, big numbers may not be good: the same patron may be seeing the survey too often.
 Alter the order of questions and answers, particularly sponsored research and instruction.
 Spot-check IP addresses against self-identified location.
 Spot-check undergraduates choosing sponsored research (measurement error).
 Check self-identified grant information against actual grants.
 Content validity: discussed with librarians and pre-tested.
 Turn-aways: the number who elected not to fill out the survey.
 Library information architecture: gateway vs. HTML pages makes a substantial difference in results.
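The IP-versus-self-identified-location spot check can be automated. A sketch using a placeholder netblock; a real check would use the institution's registered ranges, and would have to allow for proxied off-campus traffic arriving from campus addresses:

```python
import ipaddress

# Placeholder campus netblock (TEST-NET range); substitute real ranges.
CAMPUS_NETS = [ipaddress.ip_network('192.0.2.0/24')]

def location_consistent(ip, self_reported_location):
    """True when the request IP agrees with the self-reported location:
    on-campus answers should come from campus netblocks, and vice versa."""
    on_campus_ip = any(ipaddress.ip_address(ip) in net for net in CAMPUS_NETS)
    claims_on_campus = self_reported_location != 'Off Campus'
    return on_campus_ip == claims_on_campus
```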


21 What Is a Session?
Session: A successful request of an online service. It is one cycle of user activities that typically starts when a user connects to the service or database and ends by terminating activity that is either explicit (by leaving the service through exit or logout) or implicit (timeout due to user inactivity). (NISO)
COUNTER Code of Practice: Journals and Databases, Release 3, Glossary of Terms.
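The implicit-termination rule in this definition, a session ending after an inactivity timeout, is straightforward to apply to one patron's request log. A minimal sketch, assuming timestamps in seconds and a configurable timeout:

```python
def count_sessions(timestamps, timeout=30 * 60):
    """Count sessions in one patron's request timestamps: any gap longer
    than `timeout` seconds starts a new session (the implicit-termination
    rule described above)."""
    sessions, last = 0, None
    for ts in sorted(timestamps):
        if last is None or ts - last > timeout:
            sessions += 1
        last = ts
    return sessions
```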

22 Federated Searches Are Different
Search and session activity generated by federated search engines and other automated search agents should be categorized differently from regular searches. Any searches or sessions derived from any federated search engine (or similar automated search agent) should be included in separate "Searches_federated" and "Sessions_federated" counts… and are not to be included in the "Searches_run" and "Sessions" counts.
COUNTER Code of Practice: Journals and Databases, Release 3, Glossary of Terms.
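Separating these counts amounts to classifying each search by its origin before tallying. A sketch that keys off the user-agent string; the list of federated-search engines is a hypothetical example:

```python
FEDERATED_AGENTS = ('metalib', 'webfeat', '360 search')  # assumed list

def tally_searches(user_agents):
    """Split searches into COUNTER-style Searches_run and
    Searches_federated buckets by user-agent substring match."""
    counts = {'Searches_run': 0, 'Searches_federated': 0}
    for ua in user_agents:
        if any(name in ua.lower() for name in FEDERATED_AGENTS):
            counts['Searches_federated'] += 1
        else:
            counts['Searches_run'] += 1
    return counts

counts = tally_searches(['Mozilla/5.0', 'MetaLib/4.00', 'Mozilla/5.0'])
```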

23 Sample Survey Data File Generated
Affiliation | Purpose | URL | Time | Date | Location | Status
Other UConn | Instruction/Education/Departmental (Non-Funded) Research | http://newfirstsearch.oclc.org/done=referer;dbname=WorldCat;autho= ;FSIP | 12:36:50 | 12/3/2004 | Off Campus | UConn Faculty
Family Studies | Instruction/Education/Departmental (Non-Funded) Research | http://www.jstor.org/cgi-bin/jstor/gensearch | 12:37:43 | 12/3/2004 | Off Campus | UConn Undergraduate Student
Non-UConn | Other Activities | http://homerweb.lib.uconn.edu/cgi-bin/Pwebrecon.cgi?DB=local&PAGE=First | 13:08:41 | 12/3/2004 | Off Campus | Non-UConn
Non-UConn | Instruction/Education/Departmental (Non-Funded) Research | http://magic.lib.uconn.edu/index_real.html | 13:31:29 | 12/3/2004 | Off Campus | Non-UConn
Non-UConn | Other Activities | http://magic.lib.uconn.edu/index_real.html | 12:11:06 | 12/3/2004 | Off Campus | Non-UConn
Agriculture & Natural Resources | Instruction/Education/Departmental (Non-Funded) Research | http://magic.lib.uconn.edu/index_real.html | 12:33:57 | 12/3/2004 | Off Campus | Non-UConn
Education | Instruction/Education/Departmental (Non-Funded) Research | http://www.euromonitor.com/womdas/ | 12:57:44 | 12/3/2004 | Off Campus | Non-UConn
Non-UConn | Instruction/Education/Departmental (Non-Funded) Research | http://homerweb.lib.uconn.edu/cgi-bin/Pwebrecon.cgi?DB=local&PAGE=First | 13:28:52 | 12/3/2004 | Off Campus | Non-UConn
Business Administration | Other Activities | http://homerweb.lib.uconn.edu/cgi-bin/Pwebrecon.cgi?DB=local&PAGE=First | 12:56:46 | 12/3/2004 | In the Library | UConn Faculty
Liberal Arts & Sciences | Other Activities | http://www.siam.org/journals/simax/simax.htm | 12:52:17 | 12/3/2004 | On Campus - Storrs | UConn Graduate Student
Engineering | Instruction/Education/Departmental (Non-Funded) Research | http://homerweb.lib.uconn.edu/cgi-bin/Pwebrecon.cgi?DB=local&PAGE=First | 12:04:31 | 12/3/2004 | On Campus - Storrs | UConn Graduate Student
Business Administration | Instruction/Education/Departmental (Non-Funded) Research | http://proquest.umi.com/pqdweb?RQT=318 | 12:16:33 | 12/3/2004 | On Campus - Storrs | UConn Graduate Student
Business Administration | Instruction/Education/Departmental (Non-Funded) Research | http://www.jstor.org/journals/ html | 12:16:52 | 12/3/2004 | On Campus - Storrs | UConn Graduate Student
Business Administration | Instruction/Education/Departmental (Non-Funded) Research | http://homerweb.lib.uconn.edu/cgi-bin/Pwebrecon.cgi?DB=local&PAGE=First | 12:29:53 | 12/3/2004 | On Campus - Storrs | UConn Graduate Student
Business Administration | Instruction/Education/Departmental (Non-Funded) Research | http://homerweb.lib.uconn.edu/cgi-bin/Pwebrecon.cgi?DB=local&PAGE=First | 12:48:41 | 12/3/2004 | On Campus - Storrs | UConn Graduate Student
Business Administration | Instruction/Education/Departmental (Non-Funded) Research | http://proquest.umi.com/login?COPT=SU5UPTAmVkVSPTImREJTPTE3MjErMysxNkJD&clientId= | :04:23 | 12/3/2004 | On Campus - Storrs | UConn Graduate Student

24 MINES for Libraries® Implemented via StatsQUAL®: Dataflow Diagram

25 MINES for Libraries® Implemented via StatsQUAL®: Cross-Functional Flowchart

26 Reporting
 Streamline data collection
 Standardized reporting
 General overview
 Frequencies and cross tabulations for common variables
 Who is accessing electronic resources? (user status and affiliation)
 Where are they accessing electronic resources? (location)
 Why are they accessing electronic resources? (purpose/reason of use)
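The who/where/why cross tabulations described above reduce to counting pairs of survey-response fields. A minimal sketch over MINES-style records; the field names are illustrative:

```python
from collections import Counter

def crosstab(records, row_key, col_key):
    """Count (row value, column value) pairs across survey records."""
    return Counter((r[row_key], r[col_key]) for r in records)

records = [
    {'status': 'Faculty', 'location': 'Off Campus', 'purpose': 'Sponsored Research'},
    {'status': 'Graduate Student', 'location': 'In the Library', 'purpose': 'Instruction'},
    {'status': 'Faculty', 'location': 'Off Campus', 'purpose': 'Instruction'},
]
table = crosstab(records, 'status', 'location')
```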

27 MINES for Libraries® Purpose of Use by Location: U.S. Main Campus Libraries
[Chart: purpose of use (Sponsored Research, Instruction, Other Sponsored Activities, Other) by location]
In the Library: n = 9,733; On Campus, Not in the Library: n = 9,460; Off Campus: n = 7,790; Overall Use: n = 26,983
*72% of sponsored research usage of electronic resources occurred outside the library; 83% took place on campus.

28 Online Data Analysis
 Expand data analysis capabilities
 Multidimensional data views
 "Drag and drop"
 Cross tabulations
 Frequencies
 Longitudinal analysis

29 Web Interface


32 Post-Survey Consulting
Help institutions:
 Identify common themes in their data
 Identify answers to research questions
 Identify next steps according to initial goals and/or research questions

33 Resources
 Web interface
 ARL New Measures and Assessment Initiatives: MINES for Libraries®
 Articles & presentations

