I Don’t Do Research... But Steve Hiller Director, Assessment and Planning University of Washington Libraries


I Do Use Research Methods as Part of Our Assessment and Planning Program for:
Understanding our user communities
– How they work
– Their library and information needs
– How we can make them successful
Organizational improvement
– Improving organizational performance, effectiveness and efficiency
– Delivering services and programs that make a difference

Assessment: More than Numbers
Library assessment is a structured process:
To learn about our communities
To respond to the needs of our users
To improve our programs and services
To support the goals of the communities

Why Assess?
Accountability and justification; demonstrating value
Improvement of services
Comparisons with others
Identification of changing patterns
Marketing and promotion
Opportunity to tell our own story
Using data, not assumptions, to make decisions – Assumicide!

What’s Driving the Agenda
Environmental Changes
– Exploding growth in use and applications of technology
– Increased customer expectations for services, including quality and responsiveness
– “Competition” from other sources
Budgetary Constraints/Reductions
– Justification for spending $$$ on libraries
– Increasing competition for resources
– Budget reductions and reallocations
Demonstrating Value
– Accountability
– How do we enable those in our community to succeed

Traditional Library Measures: Inputs
Focus on how big/how much
Budget (staff, collections, operations)
Staff size
Collection size
Facilities
Other related infrastructure (hours, seats, computers)
Size of user communities and programs
ARL “Investment Index” measures inputs related to expenditures and staff numbers

Traditional Library Measures: Outputs
Focus on usage
Collections (print, electronic, ILL)
Reference services
Facilities (gate counts)
Instruction sessions
Discovery and retrieval
Other Web sessions
May indicate if “inputs” are used, but doesn’t tell us what users were able to accomplish as a result.

These Are Self-Reported Statistics Too!

The Challenge for Libraries
Traditional statistics are no longer sufficient
– Emphasize inputs/outputs – how big and how many
– Do not tell the library’s or customers’ story
– May not align with organizational goals and plans
– Do not measure service quality or library impact
Need better outcome measures that demonstrate the difference the library makes and the value it adds
– To the individual, community and the organization
“No longer what makes a good library but how much good does the library do” (Peter Brophy)

Assessing and Demonstrating the Library Contribution to the Institutional Mission
The library’s contribution to learning and research
– Student learning (accreditation driven)
– Externally funded research and scholarship
Value of the library to the community
– Information resources/collections
– Library as place
– Current services
Changes in library and information needs and use
Organizational performance and effectiveness
Collaborations

Good Assessment Starts Before You Begin... Some Questions to Ask
Define the question
– What’s important?
– What do you need to know, why, and when?
How will you use the information/results?
Where/how will you get the information?
– Methods used
– Existing data
– New data (where or who will you get it from)
How will you analyze the information?
Who will act upon the findings?

Four Useful Assessment Assumptions
Your problem/issue is not as unique as you think
You have more data/information than you think
You need less data/information than you think
There are useful methods that are much simpler than you think
Adapted from Douglas Hubbard, “How to Measure Anything” (2007)

Documenting Library Performance and Impact
Common library assessment methods
– Surveys (satisfaction, needs, importance)
– Usage and other library statistics
– Qualitative information (interviews, focus groups, etc.)
Other statistical data
– Institutional
– Comparator (ARL, ACRL, peer groups, customized)
– Government (NCES)
Collaborations
Value, Impact and Return on Investment
– Lib-Value (IMLS grant to measure value and return on investment in academic libraries)

Choosing the Right Assessment Method
Criteria:
Utility
Relevance/Importance
Stakeholder needs
Measurability
Cost
Timeliness
Tools:
Usage data
Surveys (local & standardized)
Standardized tests
Performance assessments
Qualitative methods
Rubrics

Presenting Assessment Findings
Make sure data/results are:
– Timely
– Understandable
– Usable
Identify important findings/key results
– What’s important to know
– What’s actionable
Present key/important results to:
– Library administration/institutional administration
– Library staff
– Other libraries/interested parties/stakeholders

Success with Assessment
Use multiple assessment methods
Mine/repurpose existing data
Invest in staff training and resources
Focus on the customer and community
Learn from our users
Partner with other campus programs and institutions
Present assessment information so that it is understandable and usable

A Skeptical View of Metrics

Association of Research Libraries (ARL) and Library Assessment
ARL has played a major role in advancing assessment in academic libraries through:
ARL Statistics
New measures and standardized methods
– LibQUAL+® user survey, MINES for Libraries
Individual library consulting
– Effective, Sustainable and Practical Assessment (ESP): 42 libraries visited to evaluate assessment needs and programs
Library Assessment Conference

ESP Insights
Uncertainty about how to establish and sustain assessment
Staff lack essential assessment/data analysis skills and knowledge
Lack of focus and assessment priorities; tenuous link to planning and decision making
Underutilization of campus assessment resources
More data collection than data utilization
Overreliance on surveys for user input
Organizational issues play a significant role in sustainable assessment

From Institution-Based Assessment to a Community of Practice
THE NEED
Bring together library folks interested in assessment
Focus on effective and practical assessment
Establish an ongoing venue for presentation of library assessment issues, activities and results
Build a continuing education component (workshops)
Make it fun!
AN ANSWER
Library Assessment Conference
– Organized by ARL, U.Va. and UW
– Biennial conference first held in 2006

Library Assessment Conference Facts
2006, Charlottesville: 80 paper/panel and 15 poster proposals; 38 paper/panel presentations, 20 posters, 3 plenary sessions; proceedings – 3 lbs.; 3 half-day workshops (each repeated), 120 participants; turnaways
2008, Seattle: 95 paper/panel and 38 poster proposals; 59 paper/panel presentations, 43 posters, 5 plenary sessions; proceedings – 4 lbs.; 6 half-day workshops, 160 participants; turnaways
2010, Baltimore: 154 paper and 55 poster proposals; 63 papers, 80 posters, 5 plenary sessions; 2 full-day and 4 half-day workshops, 160 participants; turnaways

Using Assessment for Results at the University of Washington, or How We Contribute to User Success
Assessment program established in 1991
– Focus on user needs
– Information seeking behavior and use
– Patterns of library use
– Library contribution to learning and research
– User satisfaction with services, collections, overall
Increasingly tied to strategic goals and priorities
Provides data to improve programs and services and to demonstrate the library contribution to user success

University of Washington Libraries: Assessment Methods Used
Large-scale user surveys every 3 years since 1992 (“triennial survey”)
In-library use surveys every 3 years beginning 1993
Focus groups/interviews
User-centered design
Observation (guided and non-obtrusive)
Usability
Usage statistics/data mining
Information about the assessment program available at:

UW Libraries Triennial Survey
Started in 1992 on paper; web-based version began in 2004
Survey designed by library staff; asks about needs, importance, satisfaction, use patterns, and impact (comments valuable too)
Surveys all faculty and a sample of students
Survey for each group is different, and survey questions may change over time (although a core set remains the same over time and between groups)
Survey can help measure effectiveness of existing programs and provide direction for future ones
Longest-running cyclical survey in academic libraries
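The sampling approach described above (a census of faculty plus a random sample of students) can be sketched in a few lines of Python. The population size, sample size, and ID format below are hypothetical, not UW's actual figures.

```python
import random

def draw_survey_sample(population, sample_size, seed=None):
    """Draw a simple random sample of IDs to invite to a survey."""
    rng = random.Random(seed)  # a fixed seed makes the draw reproducible
    return rng.sample(list(population), sample_size)

# Hypothetical population: invite all faculty, but only 1,000 of 30,000 students.
student_ids = [f"S{i:05d}" for i in range(30000)]
invited_students = draw_survey_sample(student_ids, 1000, seed=2010)
print(len(invited_students))  # 1000 invitations, drawn without replacement
```

Sampling students while surveying the full faculty population keeps the larger student survey manageable without sacrificing faculty coverage.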

Strategic Priorities
Expand digital and physical delivery services
Enhance library contributions to research productivity
Raise visibility and effectiveness of librarian liaisons
Inform UW researchers/authors about good scholarly communications practices
Strengthen library role in undergraduate learning
Reshape library spaces to enhance user experiences
Ensure content needed is accessible and deliverable
Implement new models of service

What We Did
Began pull-and-scan service; harmonized ILL
Implemented UW WorldCat as primary access point
Articulated service expectations for librarian liaisons
Expanded scholarly communication efforts
Began revisioning process for undergraduate library space
Brought in consultant on teaching and learning
Participated in ARL Library Scorecard Pilot (2009-)
% budget reduction
– Closed several branch libraries; cut hours; cut 29 positions in 2009
– Reduced collections budget; cut serial subscriptions

Libraries 2010 Triennial Survey Highlights
Record number of faculty and graduate student responses
Satisfaction ratings highest ever for faculty and grads; slightly lower for undergrads (at all 3 campuses)
Library contributions to teaching, learning, research and overall success rated very high by faculty/grad students
Substantial increase in use of and satisfaction with library delivery services (ILL, pull and scan)
Online access to and delivery of scholarly information, especially journals, are driving research and scholarship

UW Libraries Triennial Survey: Number of Respondents and Response Rate
[Table of respondent counts and response rates for Faculty, Grad/Prof Students (UWS), and Undergrads (UWS) across the survey cycles]
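Response rate, the statistic this table reported, is simply respondents divided by invitations. A minimal helper, checked against the faculty figures cited later in the presentation (1,634 surveys at a 39% response rate, which implies roughly 4,190 invitations; that invitation count is an inference, not a number from the slides):

```python
def response_rate(respondents, invited):
    """Return a survey response rate as a percentage of those invited."""
    if invited <= 0:
        raise ValueError("invited must be a positive count")
    return 100.0 * respondents / invited

# 1,634 faculty responses; ~4,190 invitations is inferred from the 39% rate.
print(round(response_rate(1634, 4190)))  # 39
```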

Overall Satisfaction by Group

Library Services and Resources: Overall Importance to Work by Group (Scale of 1 “Not Important” to 5 “Very Important”)

UW Libraries 2010 Triennial Survey – Libraries Contribution to: (Scale of 1 “Minor” to 5 “Major”; mean scores, % = those marking 4 or 5)
Faculty: 1634 surveys (39% response); Graduate Students: 680 surveys (32% response)
Keeping current in your field: 92%, 4.53
Finding information in related fields or new areas: 90%, 4.57
Being a more productive researcher: 92%, 4.64
Enriching student learning experiences
Overall academic success: 77%, 4.60
Making more efficient use of your time: 87%, 4.21
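The two summary statistics used on this slide, a mean score and the share of respondents marking 4 or 5 on the 1-5 scale, can be computed from raw ratings like this (the ratings below are invented for illustration, not survey data):

```python
def likert_summary(responses, top=(4, 5)):
    """Return (mean score, percent of respondents marking a top rating)."""
    if not responses:
        raise ValueError("no responses to summarize")
    mean = sum(responses) / len(responses)
    pct_top = 100.0 * sum(1 for r in responses if r in top) / len(responses)
    return mean, pct_top

# Invented sample of 1-5 ratings for a single survey question.
ratings = [5, 4, 5, 3, 4, 5, 5, 4, 2, 5]
mean, pct = likert_summary(ratings)
print(round(mean, 2), pct)  # 4.2 80.0
```

Reporting the "% marking 4 or 5" alongside the mean guards against a middling average hiding a polarized distribution.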

Importance of Books & Journals by Academic Area (2010, Faculty, Scale of 1 “not important” to 5 “very important”)

Importance of Books & Older Journals by School

Services Satisfaction and Visibility by Group, 2007 vs. 2010
Instruction (Faculty, Grad, Undergrad usefulness): satisfaction is up, visibility is down
Staff assistance (Faculty, Grad, Undergrad): satisfaction is up, visibility is the same
ILL books and journals (Faculty, Grad, Undergrad): satisfaction is up, visibility is up
Remote access to collections/services (Faculty, Grad, Undergrad): ALL GOOD!

Subject Librarian Visibility and Satisfaction By Faculty College/School (Balanced Scorecard Metric)

Use Patterns: Frequency of In-Library Visits (Weekly or more often)

Undergraduate Overall Satisfaction

Undergrad Satisfaction With Facilities

80% of the 400 Comments from UWS Undergrads Dealt with Space and Hours
“Open is one thing, space and available computers / tables with laptop plug-ins is whole other issue”
“More seating or computer areas, engineer a reduced noise level in Odegaard.”
“1. More space between the computers 2. More quiet study areas 3. Spaces to eat, drink and take breaks”
“Suzzallo-Allen. Quiet, neat, clean, cool, beautiful, access to everything I need. Ode, on the other hand...”

What People Do in Libraries by Group
In-Library Use Survey: 73% UG, 22% Grad, 5% Faculty

Other Relevant Data
During the past five years at UWS:
Total number of weekly hours libraries are open dropped 26%
Number of library seats dropped 3%
Enrollment increased by 6%
Gate counts increased by 6%, or 250,000 more entrants
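A quick arithmetic check on the gate-count figure: if 250,000 additional entrants represents a 6% increase, the prior baseline was roughly 4.2 million entrants. That baseline is an inference from the two stated numbers, not a figure from the slide.

```python
extra_entrants = 250_000   # additional entrants reported on the slide
growth = 0.06              # the stated 6% increase
baseline = extra_entrants / growth  # implied prior gate count, ~4.17 million
print(f"implied prior-year gate count: {baseline:,.0f}")
```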

How UW Libraries Has Used Assessment: A Few Examples
Extend hours in Undergraduate Library (24/5.5)
Create more diversified student learning spaces
Enhance usability of discovery tools and website
Provide standardized service training for all staff
Review and restructure librarian liaison program
Consolidate and merge branch libraries
Change/reallocate collections budget
Change/reallocate staffing
Support budget requests to University

Integrated Organizational Performance Model: The Balanced Scorecard
A model for measuring organizational performance developed in the 1990s by Kaplan and Norton that:
– Helps identify the important statistics
– Helps ensure a proper balance
– Organizes multiple statistics into an intelligible framework
Clarifies and communicates the organization’s vision
Provides a structured metrics framework for aligning assessment with strategic priorities & evaluating progress
ARL Library Scorecard Pilot with 4 libraries
– Johns Hopkins, McMaster, Virginia, Washington
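The scorecard's core mechanic, grouping metrics under Kaplan and Norton's four standard perspectives and comparing each to a target, can be sketched as a small data structure. The metric names, targets, and values below are invented, not the ARL pilot's actual measures, and for simplicity every metric is treated as higher-is-better.

```python
from dataclasses import dataclass

@dataclass
class Metric:
    name: str
    perspective: str   # Customer, Financial, Internal Process, Learning & Growth
    target: float
    actual: float

    def met(self) -> bool:
        # Simplification: every metric here is higher-is-better.
        return self.actual >= self.target

# Invented library metrics, one per perspective.
scorecard = [
    Metric("Faculty satisfaction (mean, 1-5 scale)", "Customer", 4.0, 4.4),
    Metric("Grant funding share of budget (%)", "Financial", 10.0, 12.0),
    Metric("ILL requests filled within 5 days (%)", "Internal Process", 90.0, 87.0),
    Metric("Staff with assessment training (%)", "Learning & Growth", 75.0, 80.0),
]

for m in scorecard:
    status = "met" if m.met() else "NOT met"
    print(f"{m.perspective:17} | {m.name}: {status}")
```

The value of the structure is the balance it forces: a library cannot report only the customer metrics it is proud of while ignoring process or staff-development measures.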

Goals of the ARL Pilot
Evaluate the Balanced Scorecard as a suitable performance model for academic research libraries
Assess its value as a structured process to better integrate and strengthen strategy, planning and assessment
Encourage cross-library collaboration
Review objectives and measures for commonalities between libraries

Closing the Loop: Success with Assessment
Assess what is important
Keep expectations reasonable and achievable
Use multiple assessment methods; corroborate
Mine/repurpose existing data
Focus on users: how they work, find & use information
Use the data to improve and add customer value
Keep staff, customers and stakeholders involved and informed

Eye to the Future
“Measuring performance is an exercise in measuring the past. It is the use of that data to plan an improved future that is all important.” – Peter Brophy
Data trends can inform the future
Strategic planning can frame the future
Organizational performance models can align ongoing operations with future aspirations
Understanding how customers work, how that work is changing, and ways we can make customers and institutions successful is key to the future of libraries

In Conclusion: Can You Answer These Questions?
What do we know about our communities and customers to provide services and resources to make them successful?
How do we measure the effectiveness of our services, programs and resources from the customer perspective?
What do our stakeholders need to know in order to provide the resources needed for a successful library?