Presented by Beverly Choltco-Devlin Reference and Electronic Resources Consultant Mid-York Library System September 25, 2009 REVVED UP FOR REFERENCE CONFERENCE Ithaca, NY

 What is Service Quality? ◦ Simply put, service quality is defined as "how well a service or activity is done" (Bertot, John Carlo, "Measuring Service Quality in the Networked Environment: Approaches and Considerations," Library Trends, Spring 2001)  What is Service Quality Assessment? ◦ Assessment, or evaluation, is the process through which we determine service quality

 Gap Model Hernon, P. based on A. Parasuraman, Valarie A. Zeithaml and Leonard L. Berry  Statistics, Performance Measures and Quality Standards C. McClure, R. D. Lankes, M. Gross, B. Choltco-Devlin

 Determining "the gap between customer expectations of those services of an academic or public library in general and those perceptions of the services offered by a particular academic or public library." Delivering Satisfaction and Service Quality: A Customer-Based Approach for Libraries, Peter Hernon and John R. Whitman. Chicago: American Library Association, 2001

 Using a variety of methodologies, which can be customized and applied either to the service as a whole or to specific components of the service. ◦ Statistics - sets of raw numerical data ◦ Performance measures - measurements for evaluation purposes, built on points of comparison drawn from statistics or from qualitative analysis ◦ Quality standards - a specific statement of the desired or expected level of performance that should be provided by a service or some aspect of that service (McClure, Lankes, Gross, Choltco-Devlin)

 Determine the degree to which service objectives are being met  Determine how well virtual reference services support larger organizational objectives  Produce trend data  Analyze costs vs. benefits within the service (accountability)  Compare costs and benefits in relation to other services

 Planning  Marketing/Promotion  Inform  Identify areas for improvement  Encourage thinking in terms of outcomes, results and most importantly impacts.

 Number of virtual reference queries received  Number of answers given  Total transactions  Number of questions received  Number of questions received virtually but not answered or responded to by completely virtual means (e.g., referrals, phone calls)  Type of virtual question received  Number of lost connections  Average transaction time  Number of repeat users  Number of abandoned transactions
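Raw statistics like these can be tallied straight from a service's transaction log. A minimal sketch in Python, using an entirely hypothetical record format (the field names are illustrative, not those of any real QuestionPoint or AskUs 24/7 export):

```python
# Hypothetical transaction records; a real log export will differ.
transactions = [
    {"user": "u1", "answered": True,  "lost_connection": False, "minutes": 12},
    {"user": "u2", "answered": False, "lost_connection": True,  "minutes": 3},
    {"user": "u1", "answered": True,  "lost_connection": False, "minutes": 8},
]

total = len(transactions)                                        # total transactions
answered = sum(1 for t in transactions if t["answered"])         # answers given
lost = sum(1 for t in transactions if t["lost_connection"])      # lost connections
avg_minutes = sum(t["minutes"] for t in transactions) / total    # average transaction time
users = [t["user"] for t in transactions]
repeat_users = sum(1 for u in set(users) if users.count(u) > 1)  # repeat users

print(total, answered, lost, round(avg_minutes, 1), repeat_users)
```

The point is only that each statistic on the slide reduces to a count or an average over log records; any real system's field names and export format would replace the made-up ones here.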

 Percentage of virtual reference questions out of total reference questions  Sources used per question  Virtual reference question correct answer fill rate  Saturation rate (easier for academic libraries than public, and very tricky in a virtual reference cooperative) ◦ The ratio of digital reference service users who are members of a target population to the total number of members of that target population
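Each of these performance measures is a simple ratio of the raw statistics. A worked sketch with made-up counts (all numbers hypothetical, chosen only to show the arithmetic):

```python
virtual_questions = 450      # virtual reference questions received
total_ref_questions = 3000   # reference questions across all channels
correct_answers = 428        # virtual questions judged correctly answered
service_users = 180          # distinct users who belong to the target population
target_population = 12000    # total members of the target population

pct_virtual = virtual_questions / total_ref_questions * 100
fill_rate = correct_answers / virtual_questions * 100
saturation_rate = service_users / target_population * 100

print(f"{pct_virtual:.1f}% of reference questions arrived virtually")
print(f"{fill_rate:.1f}% correct answer fill rate")
print(f"{saturation_rate:.2f}% saturation rate")
```

With these figures, 15% of reference traffic is virtual and the fill rate clears a 95% quality standard, while saturation stays low, which is typical when the target population is large.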

 Log and Report Analysis ◦ AskUs 24/7 and OCLC QuestionPoint provide excellent methods for gathering data and generating reports ◦ Can look at such things as number of sessions, user's browser, platform, and service by time of day and/or day of week ◦ If offering other types of virtual reference (e.g., e-mail, texting, IM), these measures are also useful
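Time-of-day and day-of-week breakdowns are straightforward to derive once session timestamps are exported. A sketch with hypothetical timestamps (not drawn from any real report):

```python
from collections import Counter
from datetime import datetime

# Hypothetical session start times; a real report export would supply these.
stamps = ["2009-09-21 10:15", "2009-09-21 14:40", "2009-09-22 10:05",
          "2009-09-23 19:30", "2009-09-28 10:50"]

sessions = [datetime.strptime(s, "%Y-%m-%d %H:%M") for s in stamps]
by_weekday = Counter(d.strftime("%A") for d in sessions)  # sessions per day of week
by_hour = Counter(d.hour for d in sessions)               # sessions per hour of day

print(by_weekday.most_common(1))  # busiest day of the week
print(by_hour.most_common(1))     # busiest hour of the day
```

Tallies like these can feed directly into staffing and schedule-allocation decisions.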

 User satisfaction is closely related to but not exactly the same as service quality  These are often qualitative rather than quantitative ◦ Awareness of service ◦ Accessibility ◦ Expectations for service ◦ The transaction process ◦ Reasons for use ◦ Reasons for non-use ◦ Improvements needed ◦ Impact on user

 Raw cost of the service (subscription fees, software licensing)  Allied costs (personnel, tech support, etc.)  Cost as percentage of total reference budget  Cost as percentage of total organizational budget
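The cost measures above are again simple ratios. A worked sketch with made-up figures (all amounts hypothetical):

```python
subscription = 4200.0   # raw cost: subscription fees, software licensing
allied = 18500.0        # allied costs: personnel, tech support, etc.
ref_budget = 120000.0   # total reference budget
org_budget = 950000.0   # total organizational budget

total_cost = subscription + allied
pct_of_ref = total_cost / ref_budget * 100
pct_of_org = total_cost / org_budget * 100

print(f"{pct_of_ref:.1f}% of reference budget, {pct_of_org:.1f}% of organizational budget")
```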

 Training  Skill sets - what special skills are required to engage in a virtual reference transaction that are distinct from traditional face-to-face reference  Schedule allocation  Tech support

 Quality digital resources  Ease of accessibility  Authentication for premium resources  Licensing

 A quality standard is a specific statement of the desired or expected level of performance that should be provided by a service or some aspect of that service

 Users of the virtual reference service will rate the courtesy of the library staff at a score of X on a scale of 1 to 7, with 1 being discourteous and 7 being courteous ◦ This quality standard will address the special issues arising from a virtual reference transaction, where traditional non-text cues are absent and expectations regarding courtesy, and how it is manifested, can differ radically from those of a traditional reference transaction

 Requests for information on a patron's PIN will be referred to the patron's own library only if the information is not readily available on that library's policy page (in a consortium environment)  Virtual reference transactions will have a 95% correct answer fill rate  90% of questions marked as follow-up will be responded to within 24 hours  100% of reference questions will be responded to (but not necessarily completed) within 24 hours of receipt
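Numeric standards like the fill-rate and follow-up targets above can be checked mechanically against transaction data. A sketch, again with hypothetical records (field names invented for illustration):

```python
# Hypothetical follow-up records: hours until response, and whether the
# answer was judged correct.
records = [
    {"hours_to_response": 5,  "correct": True},
    {"hours_to_response": 30, "correct": True},
    {"hours_to_response": 12, "correct": False},
    {"hours_to_response": 2,  "correct": True},
]

within_24h = sum(1 for r in records if r["hours_to_response"] <= 24) / len(records)
fill_rate = sum(1 for r in records if r["correct"]) / len(records)

print(f"responded within 24h: {within_24h:.0%} (standard: 90%)")
print(f"correct fill rate: {fill_rate:.0%} (standard: 95%)")
```

Reporting actual performance next to the stated standard, as here, makes shortfalls visible at a glance.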

 Virtual reference service evaluation: Adherence to RUSA behavioral guidelines and IFLA digital reference guidelines ◦ This link is useful for information about the book itself but also for the reference list at the bottom

 LIBQUAL+ and LIBQUAL-Lite (coming 2010) ◦ What is LibQUAL+®? LibQUAL+® is a suite of services that libraries use to solicit, understand, and act upon users' opinions of service quality. These services are offered through the Association of Research Libraries (ARL) New Measures and Assessment Initiatives. The program's centerpiece is a rigorously tested Web-based survey bundled with training that helps libraries assess and improve library services, change organizational culture, and market the library. LibQUAL+® enables systematic assessment and measurement of library service quality, over time and across institutions.

 EDMS (Evaluation Decision Making System) ◦ The EDMS is a centralized public access portal designed to provide information related to evaluation of a library's services and resources for management and advocacy purposes. Information provided includes: ◦ Types of evaluation methods typically used to assess the use of services and resources; ◦ Data each type of evaluation can provide; ◦ How to plan for data collection efforts and tips on how to analyze the data; and ◦ Strategies on how to apply the results of evaluation efforts for management and advocacy purposes.