BIX – The Library Index
Roswitha Poll, Chair of ISO TC 46 SC 8: Quality – Statistics and performance evaluation
Hamar 2010

BIX – The Library Index
A benchmarking system for public and academic libraries, running since 1999 for public libraries and since … for academic libraries
Participation is voluntary, but the full set of indicators is obligatory for participants
Annual fee: 170 €
Organized by: Bertelsmann Foundation (at the start); German Library Association (DBV)
Participants 2009: 257 libraries
- 177 public libraries (of ca. …)
- 80 academic libraries (Germany: 68 of ca. 350; 12 from Austria)

BIX – The Library Index
2 separate indices (public/academic), 17 indicators each
4 dimensions = Balanced Scorecard:
- resources
- customer focus (use)
- efficiency
- development (potentials)
Kaplan, R.S. / Norton, D.P.: The Balanced Scorecard: Translating Strategy into Action. Boston 1996

BIX – Ranking
4 dimensional scores and ranks
1 overall score and ranked list
8 categories:
- public libraries by size (5 categories)
- academic libraries by organisation: integrated library systems, two-tier systems (central libraries only), polytechnics
Indicators for academic libraries are not weighted!

Library BIX results: each participating library receives a total rank and total score, a rank and score for each of the four dimensions (resources, customer focus, efficiency, development), and the underlying indicator values.

BIX – Publication of results
► BIX magazine
► press releases
► blueprints for participants' press releases
► seminars
► individual analysis / consultancy
► BIX website: bibliotheksindex.de/

BIX Magazine
- Tables of results
- Best-practice stories
- Testimonials / interviews

Methods of quality assessment in libraries
- Performance indicators measure the effectiveness and cost-efficiency of library services: quantitative, objective
- User satisfaction surveys measure perceived quality, the users' impression of library services: qualitative, subjective
- Outcome assessment tries to show the benefits, the value for individual users and for society

BIX – benchmarking with performance indicators
The performance indicators:
- should have informative content: show whether the service or activity that is measured is "good" or "bad"
- should yield results that are comparable between libraries of similar structure and clientele
- should, for the most part, be based on data taken from the national library statistics: practical
- should be few in number, but cover all services and all stakeholders

Collections of performance indicators for libraries
- ISO 11620, 2nd ed. (2008), Information and documentation – Library performance indicators (45 indicators)
- ISO TR (2008), Information and documentation – Performance indicators for national libraries (30 indicators)
- Poll, R. and te Boekhorst, P., 2nd ed. (2007), Measuring quality: performance measurement in libraries, Saur, München (IFLA Publications 127) (40 indicators)

BIX indicators for public libraries

Resources
- Media in the collection per capita
- User area in m² per 1,000 capita
- Library staff per 1,000 capita
- Workstation hours available per capita
- Internet services
- Events per 1,000 capita

Customer Focus (Use)
- Visits per capita
- Loans per capita
- Collection turnover rate
- Opening hours per year per 1,000 capita

Efficiency
- Acquisitions budget per loan
- Employee hours per opening hour
- Visits per opening hour
- Total expenditure per visit

Development
- Renewal rate (additions to stock)
- Training as percentage of all employee hours
- Investment budget per capita
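
For illustration, a minimal sketch of how such per-capita and turnover indicators are derived from raw library statistics. The figures and the selection of indicators are hypothetical, not official BIX definitions or data:

```python
# Minimal sketch (hypothetical figures): calculating a few of the
# public-library indicators listed above from raw annual statistics.

stats = {
    "population": 50_000,     # inhabitants of the service area
    "media": 120_000,         # media units in the collection
    "loans": 300_000,         # loans per year
    "visits": 175_000,        # physical visits per year
    "opening_hours": 2_600,   # opening hours per year
}

indicators = {
    "Media in the collection per capita": stats["media"] / stats["population"],
    "Visits per capita": stats["visits"] / stats["population"],
    "Loans per capita": stats["loans"] / stats["population"],
    "Collection turnover rate": stats["loans"] / stats["media"],
    "Visits per opening hour": stats["visits"] / stats["opening_hours"],
    "Opening hours per year per 1,000 capita":
        stats["opening_hours"] / (stats["population"] / 1_000),
}

for name, value in indicators.items():
    print(f"{name}: {value:.2f}")
```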

Indicators for public libraries: Internet services
Sum of yes-answers for:
- Library website
- Web OPAC
- Interactive functions (account, reservations, etc.)
- Reference service
- Electronic collection
- Active information services (news, events, …)
Probably most public libraries will have such services.
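
A minimal sketch (with hypothetical yes/no answers) of how this indicator reduces to a simple count of the services a library offers:

```python
# Hypothetical yes/no answers for the six Internet services listed above.
services = {
    "Library website": True,
    "Web OPAC": True,
    "Interactive functions (account, reservations)": True,
    "Reference service": False,
    "Electronic collection": True,
    "Active information services (news, events)": False,
}

internet_services_score = sum(services.values())  # number of yes-answers, 0 to 6
print(internet_services_score)  # 4
```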

BIX indicators for academic libraries

Resources / Infrastructure
- Square meters of user area
- Library staff
- Expenditure on literature and information per 1,000 members of the population
- Percentage of that expenditure spent on the electronic collection
- Opening hours per week

Usage
- Library visits per capita (physical visits)
- Electronic usage (virtual visits; central counting of homepage and OPAC page visits)
- User training attendances per 1,000 members of the population
- Immediate availability (immediate loans as a percentage of total loans, including reservations and ILL)
- User satisfaction rate (identical online survey in all libraries; not used at the moment)
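
For illustration of the immediate availability indicator (hypothetical figures): if 180,000 of 200,000 total loans (including reservations and ILL) are satisfied immediately, immediate availability is 180,000 / 200,000 = 90 %.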

Efficiency
- Library expenditure per active user (acquisitions, material costs, staff)
- Ratio of acquisitions expenditure to staff costs
- Efficiency of processes (example: media processing – processed media per FTE staff)
- Efficiency of processes (example: lending – processed loans and ILL per FTE staff)

Development (Potentials)
- Days of training per staff member
- Percentage of the university budget allocated to the library
- Percentage of library funds received through third-party funds, special funds and income generation
- Percentage of library staff providing and developing electronic services

BIX – A method for counting virtual visits
Sessions on the homepage and the OPAC start page are counted by means of a counting pixel downloaded from a central server (Sebastian Mundt, Stuttgart Media University).
- The advantage of this method is that it is easy to use.
- The disadvantage is that access to other pages is not counted.
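
A minimal sketch of the counting-pixel idea: each participating page embeds a tiny image served by a central server, so every page view triggers one request that can be tallied centrally. The endpoint, parameter names (site, page) and port below are hypothetical, not the actual BIX implementation:

```python
# Hypothetical central counting server: serves a 1x1 transparent GIF and logs
# one "virtual visit" per request. A library page would embed it as, e.g.:
#   <img src="http://counter.example.org:8080/pixel.gif?site=LIB123&page=opac">
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs
from collections import Counter

# Smallest valid transparent 1x1 GIF (43 bytes).
PIXEL = (b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\xff\xff\xff!"
         b"\xf9\x04\x01\x00\x00\x00\x00,\x00\x00\x00\x00\x01\x00\x01\x00"
         b"\x00\x02\x02D\x01\x00;")

visits = Counter()  # (library, page) -> number of counted page views

class PixelHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        query = parse_qs(urlparse(self.path).query)
        site = query.get("site", ["unknown"])[0]
        page = query.get("page", ["homepage"])[0]
        visits[(site, page)] += 1            # central tally per library and page
        self.send_response(200)
        self.send_header("Content-Type", "image/gif")
        self.send_header("Content-Length", str(len(PIXEL)))
        self.end_headers()
        self.wfile.write(PIXEL)

if __name__ == "__main__":
    HTTPServer(("", 8080), PixelHandler).serve_forever()
```

Because only pages that embed the pixel are counted, the method captures homepage and OPAC start-page sessions but, as noted above, misses access to all other pages.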

Advantages of benchmarking
"The value of benchmarking as a proven tool to achieve quality management should be rated very highly indeed."
► Explains the library's own results
► Shows "best practice"
► Detects problems in processes and organisation
► Greater attention to the results from the public, the media and funding institutions
► Higher credibility of the library's reports

BIX: Survey 2010
► 1,570 libraries contacted; 694 responses = 44 %
► Which data are most important for your daily work?
1. A practical user satisfaction survey
2. Data showing cost-efficiency
3. Reliable measuring of the use of electronic services
4. Data for impact and outcome
► In what direction should BIX develop?
1. More participants (ca. 80 %)
2. No ranking, only groups (ca. 51 %)
3. Additional analysis of the results (ca. 47 %)
"More indicators", "fewer indicators" and "anonymous participation" each scored lower than 40 %.

BIX: Satisfaction with the system (scale 1 to 5, 1 = best)
- Results presented as ranking: 1.9
- The set of indicators: 2.4
- Number of participants / option of comparison within my group: 2.9
- Representativeness of the results: 3.0
- Usefulness for external presentation: 2.2
- Internal effect for management: 2.3
- Transparency / tangible calculations and results: 2.7
- Effort of data collection in the library: 2.3
- Organisation of the BIX procedures: 1.9
- Cost-benefit ratio: 2.2

"Play alone and you're bound to win." (Arabian proverb)