
1 I Don’t Do Research... But
Steve Hiller
Director, Assessment and Planning
University of Washington Libraries
hiller@u.washington.edu

2 I Do Use Research Methods as Part of Our Assessment and Planning Program for:
Understanding our user communities
– How they work
– Their library and information needs
– How we can make them successful
Organizational improvement
– Improving organizational performance, effectiveness and efficiency
– Delivering services and programs that make a difference

3 Assessment: More than Numbers
Library assessment is a structured process:
To learn about our communities
To respond to the needs of our users
To improve our programs and services
To support the goals of the communities

4 Why Assess?
Accountability and justification; demonstrating value
Improvement of services
Comparisons with others
Identification of changing patterns
Marketing and promotion
Opportunity to tell our own story
Using data, not assumptions, to make decisions
– Assumicide!

5 What’s Driving the Agenda
Environmental Changes
– Exploding growth in use and applications of technology
– Increased customer expectations for services, including quality and responsiveness
– “Competition” from other sources
Budgetary Constraints/Reductions
– Justification for spending $$$ on libraries
– Increasing competition for resources
– Budget reductions and reallocations
Demonstrating Value
– Accountability
– How do we enable those in our community to succeed?

6 Traditional Library Measures: Inputs
Focus on how big/how much:
Budget (staff, collections, operations)
Staff size
Collection size
Facilities
Other related infrastructure (hours, seats, computers)
Size of user communities and programs
ARL “Investment Index” measures inputs related to expenditures and staff numbers

7 Traditional Library Measures: Outputs
Focus on usage:
Collections (print, electronic, ILL)
Reference services
Facilities (gate counts)
Instruction sessions
Discovery and retrieval
Other web sessions
These may indicate whether “inputs” are used, but they don’t tell us what users were able to accomplish as a result.

8 These Are Self-Reported Statistics Too!

9 The Challenge for Libraries
Traditional statistics are no longer sufficient
– Emphasize inputs/outputs: how big and how many
– Do not tell the library’s or customers’ story
– May not align with organizational goals and plans
– Do not measure service quality or library impact
We need better outcome measures that demonstrate the difference the library makes and the value it adds
– To the individual, the community and the organization
“No longer what makes a good library, but how much good does the library do” (Peter Brophy)

10 Assessing and Demonstrating the Library Contribution to the Institutional Mission
The library’s contribution to learning and research
– Student learning (accreditation driven)
– Externally funded research and scholarship
Value of the library to the community
– Information resources/collections
– Library as place
– Current services
Changes in library and information needs and use
Organizational performance and effectiveness
Collaborations

11 Good Assessment Starts Before You Begin... Some Questions to Ask
Define the question
– What’s important?
– What do you need to know, why, and when?
How will you use the information/results?
Where/how will you get the information?
– Methods used
– Existing data
– New data (where or from whom will you get it?)
How will you analyze the information?
Who will act upon the findings?

12 Four Useful Assessment Assumptions
Your problem/issue is not as unique as you think
You have more data/information than you think
You need less data/information than you think
There are useful methods that are much simpler than you think
Adapted from Douglas Hubbard, How to Measure Anything (2007)

13 Documenting Library Performance and Impact
Common library assessment methods
– Surveys (satisfaction, needs, importance)
– Usage and other library statistics
– Qualitative information (interviews, focus groups, etc.)
Other statistical data
– Institutional
– Comparator (ARL, ACRL, peer groups, customized)
– Government (NCES)
Collaborations
Value, Impact and Return on Investment
– Lib-Value (IMLS grant to measure value and return on investment in academic libraries)

14 Choosing the Right Assessment Method
Criteria:
– Utility
– Relevance/importance
– Stakeholder needs
– Measurability
– Cost
– Timeliness
Tools:
– Usage data
– Surveys (local & standardized)
– Standardized tests
– Performance assessments
– Qualitative methods
– Rubrics

15 Presenting Assessment Findings
Make sure data/results are:
– Timely
– Understandable
– Usable
Identify important findings/key results
– What’s important to know
– What’s actionable
Present key/important results to:
– Library administration/institutional administration
– Library staff
– Other libraries/interested parties/stakeholders

16 Success with Assessment
Use multiple assessment methods
Mine/repurpose existing data
Invest in staff training and resources
Focus on the customer and community
Learn from our users
Partner with other campus programs and institutions
Present assessment information so that it is understandable and usable

17 A Skeptical View of Metrics

18 Association of Research Libraries (ARL) and Library Assessment
ARL has played a major role in advancing assessment in academic libraries through:
ARL Statistics
New measures and standardized methods
– LibQUAL+® user survey, MINES for Libraries
Individual library consulting
– Effective, Sustainable and Practical Assessment (ESP): 42 libraries visited 2005-2010 to evaluate assessment needs and programs
Library Assessment Conference

19 ESP Insights
Uncertainty about how to establish and sustain assessment
Staff lack essential assessment and data-analysis skills and knowledge
Lack of focus and assessment priorities; tenuous link to planning and decision making
Underutilization of campus assessment resources
More data collection than data utilization
Overreliance on surveys for user input
Organizational issues play a significant role in sustainable assessment

20 From Institution-Based Assessment to a Community of Practice
THE NEED
Bring together library folks interested in assessment
Focus on effective and practical assessment
Establish an ongoing venue for presentation of library assessment issues, activities and results
Build a continuing education component (workshops)
Make it fun!
AN ANSWER
The Library Assessment Conference
– Organized by ARL, U.Va and UW
– Biennial conference first held in 2006

21 Library Assessment Conference Facts
2006, Charlottesville
– Proposals: 80 paper/panel, 15 poster
– Presentations: 38 paper/panel, 20 posters, 3 plenary (proceedings: 3 lbs.)
– Workshops: 3 half-day (each repeated), 120 participants
– Registrants: 220, with 30 turnaways
2008, Seattle
– Proposals: 95 paper/panel, 38 poster
– Presentations: 59 paper/panel, 43 posters, 5 plenary (proceedings: 4 lbs.)
– Workshops: 6 half-day, 160 participants
– Registrants: 375, with 15 turnaways
2010, Baltimore
– Proposals: 154 papers, 55 posters
– Presentations: 63 papers, 80 posters, 5 plenary
– Workshops: 2 full-day, 4 half-day, 160 participants
– Registrants: 475, with 20 turnaways

22 Using Assessment for Results at the University of Washington, or How We Contribute to User Success
Assessment program established in 1991
– Focus on user needs
– Information-seeking behavior and use
– Patterns of library use
– Library contribution to learning and research
– User satisfaction with services, collections, and the Libraries overall
Increasingly tied to strategic goals and priorities
Provides data to improve programs and services and to demonstrate the library contribution to user success

23 University of Washington Libraries: Assessment Methods Used
Large-scale user surveys every 3 years since 1992 (the “triennial survey”)
In-library use surveys every 3 years beginning 1993
Focus groups/interviews
User-centered design
Observation (guided and non-obtrusive)
Usability
Usage statistics/data mining
Information about the assessment program is available at: http://www.lib.washington.edu/assessment/

24 UW Libraries Triennial Survey
Started in 1992 on paper; a web-based version began in 2004
Designed by library staff; asks about needs, importance, satisfaction, use patterns, and impact (comments are valuable too)
Surveys all faculty and a sample of students (see the sketch below)
The survey for each group is different, and questions may change over time (although a core set remains the same over time and between groups)
Can help measure the effectiveness of existing programs and provide direction for future ones
Longest-running cyclical survey in academic libraries
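
Since all faculty but only a sample of students are surveyed, a sampling step is needed for the student groups. A minimal sketch of one way that draw might look; the roster and sample size here are hypothetical, as the slides do not describe UW's actual procedure:

    # Minimal sketch: draw a simple random sample of students to invite.
    # The roster and sample size are hypothetical; the slides do not
    # describe UW's actual sampling procedure.
    import random

    random.seed(2010)  # reproducible draw, for illustration only
    student_roster = [f"student_{i:05d}" for i in range(30000)]  # hypothetical roster
    invitees = random.sample(student_roster, k=2300)             # hypothetical sample size

    print(f"{len(invitees)} students invited to take the survey")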

25 Strategic Priorities 2007-2010
Expand digital and physical delivery services
Enhance library contributions to research productivity
Raise visibility and effectiveness of librarian liaisons
Inform UW researchers/authors about good scholarly communication practices
Strengthen library role in undergraduate learning
Reshape library spaces to enhance user experiences
Ensure needed content is accessible and deliverable
Implement new models of service

26 What We Did 2007-2009
Began pull-and-scan service; harmonized ILL
Implemented UW WorldCat as primary access point
Articulated service expectations for librarian liaisons
Expanded scholarly communication efforts
Began re-visioning process for undergraduate library space
Brought in a consultant on teaching and learning
Participated in ARL Library Scorecard Pilot (2009-)
2009: 12% budget reduction
– Closed several branch libraries; cut hours; cut 29 positions in 2009
– Reduced collections budget; cut serial subscriptions

27 Libraries 2010 Triennial Survey Highlights
Record number of faculty and graduate student responses
Satisfaction ratings highest ever for faculty and grads; slightly lower for undergrads (at all 3 campuses)
Library contributions to teaching, learning, research and overall success rated very high by faculty and grad students
Substantial increase in use of, and satisfaction with, library delivery services (ILL, pull and scan)
Online access to and delivery of scholarly information, especially journals, are driving research and scholarship

28 UW Libraries Triennial Survey: Number of Respondents and Response Rate, 1992-2010 (http://www.lib.washington.edu/assessment/)

Year   Faculty      Grad/Prof (UWS)   Undergrads (UWS)
2010   1634 (39%)   640 (32%)         365 (16%)
2007   1455 (36%)   580 (33%)         467 (20%)
2004   1560 (40%)   627 (40%)         502 (25%)
2001   1345 (36%)   597 (40%)         497 (25%)
1998   1503 (40%)   457 (46%)         787 (39%)
1995   1359 (31%)   409 (41%)         463 (23%)
1992   1108 (28%)   560 (56%)         407 (41%)
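
Each cell above is a respondent count with its response rate (respondents divided by the number surveyed). A minimal sketch of that calculation; the "surveyed" figures are back-calculated approximations from the 2010 row, not published numbers:

    # Minimal sketch: compute response rates from (respondents, surveyed) pairs.
    # The "surveyed" figures are approximations implied by the 2010 counts
    # and rates above, not published numbers.
    surveys_2010 = {
        "Faculty":          (1634, 4190),   # 1634 / 4190 ≈ 39%
        "Grad/Prof (UWS)":  (640, 2000),    # 640 / 2000 = 32%
        "Undergrads (UWS)": (365, 2280),    # 365 / 2280 ≈ 16%
    }

    for group, (respondents, surveyed) in surveys_2010.items():
        rate = respondents / surveyed
        print(f"{group:<17} {respondents:>4} of {surveyed} responded ({rate:.0%})")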

29 Overall Satisfaction by Group 1995-2010

30 Library Services and Resources: Overall Importance to Work by Group (scale of 1 “Not Important” to 5 “Very Important”)

31 UW Libraries 2010 Triennial Survey: Libraries Contribution to... (scale of 1 “Minor” to 5 “Major”)
Percentages are those marking 4 or 5; the second figure is the mean score.
Faculty: 1634 surveys (39% response). Graduate students: 680 surveys (32% response).

Contribution                                          Faculty       Grad Students
Keeping current in your field                         92% / 4.67    90% / 4.53
Finding information in related fields or new areas    90% / 4.56    91% / 4.57
Being a more productive researcher                    92% / 4.63    93% / 4.64
Enriching student learning experiences (faculty)      77% / 4.18
Overall academic success (grad students)                            92% / 4.60
Making more efficient use of your time                87% / 4.46    80% / 4.21
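
For each item the slide reports two numbers: the share of respondents marking 4 or 5 ("top-two-box") and the mean rating. A minimal sketch of how both are computed from raw ratings; the ratings list is invented for illustration:

    # Minimal sketch: mean score and "% marking 4 or 5" for one survey item.
    # The ratings are invented for illustration; real input would be one
    # 1-5 rating per respondent.
    ratings = [5, 4, 5, 3, 4, 5, 2, 4, 5, 5]

    mean_score = sum(ratings) / len(ratings)
    top_two_box = sum(1 for r in ratings if r >= 4) / len(ratings)

    print(f"mean = {mean_score:.2f}")             # 4.20 for this sample
    print(f"marking 4 or 5 = {top_two_box:.0%}")  # 80% for this sample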

32 Importance of Books & Journals by Academic Area (2010, faculty, scale of 1 “Not Important” to 5 “Very Important”)

33 Importance of Books & Older Journals by School

34 Services Satisfaction (S) and Visibility (V) by Group, 2007/2010

Instruction (S is up, V is down; undergrads rated usefulness)
            2010 S   2010 V   2007 S   2007 V
Faculty     4.45     34%      4.27     52%
Grad        4.20     42%      3.80     55%
Undergrad   3.36     44%      3.21     49%

Staff assistance (S is up, V is the same)
Faculty     4.48     75%      4.42     76%
Grad        4.30              4.06     75%
Undergrad   4.04              3.94     69%

ILL Books and Journals (S is up, V is up)
Faculty     4.44     77%      4.25     63%
Grad        4.45     81%      4.19     61%
Undergrad   4.06     57%      3.90     46%

Remote access to collections/services (all good!)
Faculty     4.64     90%
Grad        4.65     89%
Undergrad   4.20

35 Subject Librarian Visibility and Satisfaction By Faculty College/School (Balanced Scorecard Metric)

36 Use Patterns: Frequency of In-Library Visits 1998-2010 (Weekly or more often)

37 Undergraduate Overall Satisfaction 2007-2010

38 Undergrad Satisfaction With Facilities

39 80% of the 400 Comments from UWS Undergrads Dealt with Space and Hours
“Open is one thing, space and available computers / tables with laptop plug-ins is whole other issue”
“More seating or computer areas, engineer a reduced noise level in Odegaard.”
“1. More space between the computers 2. More quiet study areas 3. Spaces to eat, drink and take breaks”
“Suzzallo-Allen. Quiet, neat, clean, cool, beautiful, access to everything I need. Ode, on the other hand...”

40 What People Do in Libraries by Group, 2008 (In-Library Use Survey: 73% UG, 22% Grad, 5% Faculty)

41 Other Relevant Data
During the past five years at UWS:
Total number of weekly hours the libraries are open dropped 26%
Number of library seats dropped 3%
Enrollment increased by 6%
Gate counts increased by 6%, or 250,000 more entrants
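
A quick check on that last pair of figures: if 250,000 additional entrants represents a 6% increase, the implied baseline is roughly 250,000 / 0.06 ≈ 4.2 million entrants per year.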

42 How UW Libraries Has Used Assessment: A Few Examples
Extend hours in Undergraduate Library (24/5.5, i.e., 24 hours a day, 5.5 days a week)
Create more diversified student learning spaces
Enhance usability of discovery tools and website
Provide standardized service training for all staff
Review and restructure librarian liaison program
Consolidate and merge branch libraries
Change/reallocate collections budget
Change/reallocate staffing
Support budget requests to the University

43 Integrated Organizational Performance Model: The Balanced Scorecard
A model for measuring organizational performance, developed in the 1990s by Kaplan and Norton, that:
– Helps identify the important statistics
– Helps ensure a proper balance
– Organizes multiple statistics into an intelligible framework
Clarifies and communicates the organization’s vision
Provides a structured metrics framework for aligning assessment with strategic priorities & evaluating progress
ARL Library Scorecard Pilot in 2009-10 with 4 libraries
– Johns Hopkins, McMaster, Virginia, Washington
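
To make the "structured metrics framework" idea concrete, here is a minimal sketch of how a scorecard objective and its measures might be represented; the objective, measure names and numbers are invented examples, not the ARL pilot's actual metrics:

    # Minimal sketch of a scorecard objective with target-vs-actual measures.
    # All names and numbers here are invented, not ARL pilot metrics.
    from dataclasses import dataclass, field

    @dataclass
    class Measure:
        name: str
        target: float
        actual: float

        def on_target(self) -> bool:
            return self.actual >= self.target

    @dataclass
    class Objective:
        description: str
        measures: list = field(default_factory=list)

    objective = Objective(
        description="Raise visibility of subject librarians",  # hypothetical
        measures=[
            Measure("Faculty aware of their subject librarian (%)", 75.0, 68.0),
            Measure("Faculty satisfaction with liaison (mean, 1-5)", 4.0, 4.3),
        ],
    )

    for m in objective.measures:
        status = "on target" if m.on_target() else "below target"
        print(f"{m.name}: {m.actual} vs. target {m.target} -> {status}")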

44 Goals of the ARL Pilot
Evaluate the Balanced Scorecard as a suitable performance model for academic research libraries
Assess its value as a structured process to better integrate and strengthen strategy, planning and assessment
Encourage cross-library collaboration
Review objectives and measures for commonalities between libraries

46 Closing the Loop: Success with Assessment
Assess what is important
Keep expectations reasonable and achievable
Use multiple assessment methods; corroborate
Mine/repurpose existing data
Focus on users: how they work, find & use information
Use the data to improve and add customer value
Keep staff, customers and stakeholders involved and informed

47 Eye to the Future
“Measuring performance is an exercise in measuring the past. It is the use of that data to plan an improved future that is all important.” (Peter Brophy)
Data trends can inform the future
Strategic planning can frame the future
Organizational performance models can align ongoing operations with future aspirations
Understanding how customers work, how that work is changing, and the ways we can make customers and institutions successful is key to the future of libraries

48 In Conclusion: Can You Answer These Questions?
What do we know about our communities and customers that enables us to provide services and resources that make them successful?
How do we measure the effectiveness of our services, programs and resources from the customer perspective?
What do our stakeholders need to know in order to provide the resources needed for a successful library?

