Making Library Assessment Work ARL 4th Human Resources Management Symposium Washington, D.C. November 9, 2004 Steve Hiller and Jim Self University of Washington and University of Virginia Association of Research Libraries

Why Assess? Accountability and justification Improvement of services Comparison with others Identification of changing patterns Identification of questionable services Marketing and promotion

Good assessment practices Focus on the user Diverse samples of users Fair and unbiased queries Measurable results Criteria for success Qualitative and quantitative techniques Corroboration

Assessment is not… Quick and easy Free and easy A one-time effort A complete diagnosis A roadmap to the future

“…but to suppose that the facts, once established in all their fullness, will ‘speak for themselves’ is an illusion.” Carl Becker Annual Address of the President of the American Historical Association, 1931

What Does It Mean? Understanding Your Data Scan results for basic overview – Frequencies, means, patterns, variation Use statistical analyses that make sense Qualitative information and comparisons provide context and understanding Seek internal or external validation – Within same data sets or others Identify what is important and why
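To make the first step concrete, here is a minimal Python sketch of that kind of initial scan: frequencies, means, and variation before any deeper analysis. The column names and responses are hypothetical examples, not data from either library.

```python
# First-pass scan of survey results: frequencies, means, variation.
# Column names and values are hypothetical examples.
import pandas as pd

responses = pd.DataFrame({
    "user_group": ["faculty", "grad", "undergrad", "faculty", "grad"],
    "satisfaction": [5, 4, 3, 4, 5],  # 1-to-5 rating scale
})

# Frequencies: how often each rating was given
print(responses["satisfaction"].value_counts().sort_index())

# Overall mean and variation
print(responses["satisfaction"].agg(["mean", "std"]))

# Pattern check: does satisfaction differ by user group?
print(responses.groupby("user_group")["satisfaction"].mean())
```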

Communicating and Using Results Identify key findings, not all results Mix text, data, and graphics – avoid jargon – add context Know your audiences. Make it understandable Prioritize potential action items and follow-up Identify “Handoffs” to those responsible for action Look for some easy “wins” – Quick, inexpensive, and noticeable Report results

Effective Assessment Easier Said Than Done Libraries in many cases are collecting data without really having the will, organizational capacity, or interest to interpret and use the data effectively in library planning. The profession could benefit from case studies of those libraries that have conducted research efficiently and applied the results effectively. (Denise Troll Covey, Usage and Usability Assessment: Practices and Concerns, 2002)

Two Approaches to Assessment University of Washington – User needs assessment – Large-scale cyclical surveys and ongoing qualitative input – Assessment distributed throughout organization University of Virginia – Performance and financial standards – Compilation of data from varied sources – Centralized Management Information Services unit

UW Libraries Assessment Organization Library Assessment Coordinator (50%) – Chairs Library Assessment Group (9 members) – Coordinates and develops broad-based user needs assessment efforts (surveys, focus groups, observation) – Encourages and supports other assessment work Shared and Distributed Assessment Activities Usability (Web Services) E-Metrics (Assessment, Collection Management Services) Management information (Assessment, Budget, CMS) Instruction (Information Literacy, Assessment) Digital Library (Digital initiatives, Public Services, Assessment)

UW Assessment Methods Large scale user surveys every 3 years (“triennial survey”): 1992, 1995, 1998, 2001, 2004 In-library use surveys every 3 years beginning 1993 LibQUAL+™ in 2000, 2001, 2002, 2003 Focus groups on varied topics (annually since 1998) Observation (guided and non-obtrusive) Usability E-Metrics

Growing Assessment at UW From Project-Based to Ongoing and Sustainable Libraries’ first strategic plan in 1991 called for a survey as part of a user-centered services philosophy – Initial large-scale library survey done in 1992 as a “one-time project” Library Services Committee formed in 1993 – Conducted in-library use surveys in 1993 and 1996, triennial survey in 1995 Library Assessment Group appointed in 1997 – Focus groups, observation studies, in-library and triennial surveys Collection Management Services, 1997 – E-Metrics and collections use Library Systems, 1997 – Usability, Web usage logs Library Assessment Coordinator (50%) appointed 1999

UW Assessment Priorities Information seeking behavior and use Library use patterns Library importance and impact User priorities for the library User satisfaction with services, collections, overall Using data to make informed decisions that lead to library improvement

How UW Has Used Assessment Information Understand that different academic groups have different needs Make our physical libraries “student” places Identify student information technology needs Move to desktop delivery of resources Enhance resource discovery tools Provide standardized service training for all staff Stop activities that do not add value to users Consolidate and merge branch libraries

Branch Library Consolidation A UW Case Study 3 Social Science libraries consolidated in 1994 – Described in an ARL SPEC Kit Review Committee formed in 2002 Changing use patterns a bigger driver than budget Review to be objective and data-based Identify one library to be consolidated into main library

Performance Measures to Assess Branch Library Viability Use – Print items, photocopies, reference questions, gate counts Primary user population – Number of faculty and students, change over time Facility quality – For users, collections, and staff Physical library dependency of primary users

Data Sources Used Library-generated use data (including trend data) Electronic resources use data supplied by vendors University enrollment data (including trend data) Interviews, focus groups, survey comments Facility data Survey data – Triennial survey – In-library use Cost data

Primary User Groups by Branch Library (2003 University data)

Facility Space Quality: Methodology Discussed facility issues with unit staff Reviewed user survey comments from 2001 and 2002 Used previous focus group data for fine arts libraries Developed list of criteria A team of 3 walked through each unit A second walk-through was conducted 2 months later Each member of the team assigned a score of 1 to 5 for quality of staff, collections, and user spaces. Scores were compared and made consistent.
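One way the three walk-through scores might be combined and checked for consistency is sketched below; the units, aspects, and the one-point disagreement threshold are illustrative assumptions, not details taken from the review itself.

```python
# Combine three raters' 1-to-5 facility-quality scores and flag
# large disagreements for reconciliation. All values are illustrative.
from statistics import mean

scores = {  # (unit, aspect) -> the three walk-through raters' scores
    ("Forest Resources", "user spaces"): [2, 3, 2],
    ("Forest Resources", "collections"): [3, 3, 4],
    ("Natural Sciences", "staff"):       [4, 2, 5],
}

for (unit, aspect), ratings in scores.items():
    spread = max(ratings) - min(ratings)
    note = "  <- reconcile" if spread > 1 else ""  # assumed threshold
    print(f"{unit:17} {aspect:12} mean={mean(ratings):.1f}{note}")
```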

Facility Quality

Science Faculty: Libraries Used Regularly (2001 Survey)

College/Dept (Faculty Responses)   "HOME" Library   Natural Sciences/Allen   Other
Chemistry (36)                          72%                33%               28% (Physics)
Fish-Ocean (42)                         69%                45%               10% (Engineering)
Forest Resources (28)                   79%                57%               14% (Engineering)
Math-Stat (27)                          93%                30%                7% (Engineering)
Physics-Astronomy (36)                  78%                25%               22% (Chemistry)

Forestry Faculty and Grad Students: Frequency of Library Use

Mode                  Group   …x per week   …x per week   …x+ per week   …x+ per week
Visit in Person       FAC         29%           24%            0%
                      GRAD        43%           18%           23%
Campus Computer       FAC         46%           30%           32%            52%
                      GRAD        53%           18%           35%            71%
Off-Campus Computer   FAC         36%           35%            0%            13%
                      GRAD        24%           53%           12%            24%

Merger Time Line Review Group formed Spring 2002 Recommendations submitted February 2003 – Merger of Forest Resources Library – Identification of two other libraries for later merger Recommendation accepted June 2003 Joint implementation team appointed September 2003 Forestry faculty and students surveyed and presentation made February 2004 Forest Resources Library merged into Natural Sciences Library August 2004

Triennial Survey Spring 2004 Satisfaction All Faculty and Forestry Faculty (1998, 2001, 2004)

The University of Virginia Library Organizational Culture Customer Service Collecting and using data Innovation Flexibility Learning and development Participation and discussion Pride

In the words of our leader… Use data to IMPROVE: services, collections, processes, performance, etc. Don’t use data to preserve the status quo. – Karin Wittenborg, University Librarian, University of Virginia, June 24, 2004

University of Virginia Library Organizing for Assessment Management Information Services unit – Established in 1996 – Currently 3 staff – Resource for library management and staff – Advocates for sustainable assessment Centralized data collection, analysis and compilation Multifaceted approaches

Collecting the Data at U.Va. Customer Surveys Staff Surveys Mining Existing Records Comparisons with peers Qualitative techniques

Customer Surveys Faculty – 1993, 1996, 2000, 2004 Students – 1994, 1998, 2001, 2005 – Separate analyses for grads and undergrads

Faculty Priorities 1993 to 2004

Using Customer Survey Results – UVa Additional resources for the science libraries (1994+) Major renovation (2001) Revision of library instruction for first year students (1995) Redefinition of collection development (1996) Initiative to improve shelving (1999) Undergraduate library open 24 hours (2000) Additional resources for the Fine Arts Library (2000) Support transition from print to electronic journals (2004)

Staff Surveys Internal Customer Service – 2002, 2003, 2004 – 1 to 5 satisfaction scale Worklife Survey – 2004 – Agree or disagree with positive statements

Internal Customer Service Surveys Ratings (1 to 5) of units providing service to other library staff Reports to managers and administrators Anonymous structured interviews to follow up Survey expanded in 2004 to include all library departments

Worklife Survey Areas of inquiry – Job Satisfaction – Interpersonal Relations – Communications & Collaborations – Diversity – Resource Availability – Staff Development – Health & Safety Report at Library ‘Town Meeting’ Focus groups following up

Data Mining Acquisitions Circulation Finance University Records

Acquisitions Expenditures by Format University of Virginia Library

University of Virginia Library Serving the Customer


Comparisons with Peers Within the University Within ARL

Expenditures of UVA Academic Division, 1989–2003: Other Academic Support (+200%), Research (+219%), Total Academic Division (+140%), Libraries (+81%), Instruction (+80%)

Median Faculty Salaries University of Virginia Library Compared to ARL Median

Qualitative Techniques Focus Groups – Preparation for work life survey – Follow up to work life survey Structured Interviews – Anonymous follow-up to customer service survey Open Discussions

Corroboration Data are more credible if they are supported by other information John le Carré’s two proofs

Analyzing Survey Results Two Scores for Resources, Services, Facilities – Satisfaction = Mean Rating (1 to 5) – Visibility = Percentage Answering the Question Permits comparison over time and among groups Identifies areas that need more attention
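A minimal sketch of computing the two scores as defined above, with hypothetical responses in which None marks a respondent who skipped the question:

```python
# Satisfaction = mean of the 1-to-5 ratings actually given.
# Visibility = percentage of all respondents who answered the question.
ratings = [5, 4, None, 3, 5, None, 4, 4]  # hypothetical responses

answered = [r for r in ratings if r is not None]
satisfaction = sum(answered) / len(answered)     # mean rating, 1 to 5
visibility = 100 * len(answered) / len(ratings)  # % answering

print(f"Satisfaction: {satisfaction:.2f}  Visibility: {visibility:.0f}%")
```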

UVa Reference Activity and Reference Visibility in Student Surveys

The Balanced Scorecard Managing and assessing data The Balanced Scorecard is a layered and categorized instrument that – Identifies the important statistics – Ensures a proper balance – Organizes multiple statistics into an intelligible framework

The scorecard measures are “balanced” into four areas The user perspective The finance perspective The internal process perspective The future (learning and growth) perspective

Metrics Each metric has specific targets indicating full success, partial success, and failure At the end of the year we know whether we have met our target for each metric A metric may be a complex measure encompassing several elements
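A minimal sketch of scoring a year-end value against such two-level targets (Target 1 = full success, Target 2 = partial success, the convention used in the metric slides that follow); the function and example are illustrative, not the Library’s actual scorecard tooling.

```python
# Classify a year-end value against a metric's two-level targets:
# Target 1 met = full success; only Target 2 met = partial success.
def score_metric(value: float, target1: float, target2: float) -> str:
    if value >= target1:
        return "full success"
    if value >= target2:
        return "partial success"
    return "failure"

# Example with Metric L.2.B's retention targets (95% and 90%):
print(score_metric(93.0, target1=95.0, target2=90.0))  # partial success
```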

Rationale for the BSC: Getting Control of the Data Focus Balance Assessment Intelligibility

The BSC at the U.Va. Library Implemented in 2001 Reports for FY2002 and FY2003 Reporting results for FY2004 Metrics for FY2005 in place A work in progress

Metric L.1.A. Impact of Training Target 1: Positive scores (4 or 5) on training statements from 80% of respondents in the annual work-life survey. Target 2: Positive scores (4 or 5) on training statements from 60% of respondents in the annual work-life survey.

Metric I.2.A. Internal Communications Target 1: Positive scores (4 or 5) on internal communications statements from 80% of respondents in the annual work-life survey. Target 2: Positive scores (4 or 5) on internal communications statements from 60% of respondents in the annual work-life survey.

Metric I.3.B. Staff Rating of Internal Services Target 1: A composite rating of 4.00 in the annual internal services survey, with no unit rated below … Target 2: A composite rating of 3.50, with no unit rated below 3.00.

Metric L.2.A. Job Satisfaction Target 1: Positive scores (4 or 5) on job satisfaction statements from 80% of respondents in the annual work-life survey. Target 2: Positive scores (4 or 5) on job satisfaction statements from 60% of respondents in the annual work-life survey.

Metric L.2.B. Retention Rate of Employees Target 1: Retain 95% of employees. Target 2: Retain 90% of employees.

Metric L.2.C. Compare Salaries to Peer Groups Target 1: The median faculty salary at the U.Va. Library should rank in the upper 40% of all ARL university libraries. Target 2: The median faculty salary at the U.Va. Library should rank in the upper 50% of all ARL university libraries.

Metric L.2.D. Diversity of Staff Target 1: A net increase of at least 4 in faculty/staff diversity, with a net increase of at least 2 in faculty diversity. Target 2: A net increase of at least 2 in faculty/staff diversity, with a net increase of at least 1 in faculty diversity.

Metric L.4.A. Develop a Culture of Assessment Target 1: 75% of respondents score 12 or more positive responses on the Culture of Assessment IQ instrument. Target 2: 50% of respondents score 12 or more positive responses.

Moving Forward Understand your limitations – Use data wisely and appropriately – Don’t do more than you can support or utilize Don’t expect perfection; strive for accuracy and honesty Assess what is important to the library and the staff Use the data to improve But always keep your focus on the user

For more information… Steve Hiller and Jim Self ARL Assessment Project: old.libqual.org/documents/admin/VPOHillerSelf.pdf