Technical Services Assessment in Pennsylvania Academic Libraries Rebecca L. Mugridge Pennsylvania State University Pennsylvania Library Association September 30, 2012

Why this topic?
• 2011 PaLA CRD Spring Workshop with Megan Oakleaf as keynote speaker
• The Value of Academic Libraries: A Comprehensive Research Review and Report (Chicago: Association of College and Research Libraries, 2010)
• The report includes 22 recommendations for next steps for librarians who wish to demonstrate value

Recommendation: Mobilize library administrators (1)
• Communicating assessment needs and results to library stakeholders
• Using evidence-based decision making
• Creating confidence in library assessment efforts
• Dedicating assessment personnel and training

Recommendation: Mobilize library administrators (2)
• Fostering environments that encourage creativity and risk taking
• Integrating library assessment within library planning, budget, and reward structures
• Ensuring that assessment efforts have requisite resources

Survey proposal
Survey the academic libraries in Pennsylvania to determine:
• Whether they conducted assessment of technical services
• How they conducted assessment
• How they shared the results of their assessment activities
• What actions they took based on their assessment activities

Why a focus on Technical Services?
• Technical services staff equal 20% of our total staff
• Very little published on technical services assessment
• Most articles that do address assessment in technical services have to do with assessing specific processes
• Interested in a broader approach to technical services assessment

Technical Services
For the purposes of the survey, technical services is defined as the units responsible for:
• Cataloging/Metadata
• Acquisitions
• Electronic resources management
• Preservation/Bindery/Physical processing

Why Pennsylvania academic libraries?
• It’s a large group (over 120 libraries), and I thought it would provide useful, generalizable results
• These libraries might be inherently interested in the results, and therefore likely to participate

Academic libraries in Pennsylvania
Downloaded a list of academic libraries from this Pennsylvania Department of Education website: spx
Included names of institution, library, and director, and phone numbers. It did not include email addresses!

Academic libraries in Pennsylvania, cont’d
• Organized alphabetically
• Deleted duplicates and those without “college” or “university” in the name of the institution
• Updated director names (the list was not completely up-to-date)
• Added email addresses (with help)
• Resulted in 129 academic libraries, but couldn’t find email addresses for nine of them
• End result: 120 libraries were survey candidates
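The cleanup steps above amount to deduplicating a name list and filtering on the institution name; a minimal Python sketch, using invented institution names rather than the actual Department of Education data:

```python
# Hypothetical institution names for illustration only.
raw = [
    "Acme University",
    "Acme University",               # duplicate to be deleted
    "Bryn Mawr College",
    "Keystone Technical Institute",  # no "college"/"university" in name
    "Widener University",
]

# Delete duplicates, then organize alphabetically.
unique = sorted(set(raw))

# Keep only institutions with "college" or "university" in the name.
candidates = [name for name in unique
              if "college" in name.lower() or "university" in name.lower()]

print(candidates)
# ['Acme University', 'Bryn Mawr College', 'Widener University']
```

The same two-step pattern (dedupe, then filter) scales unchanged from this toy list to the 129-entry list described on the slide.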

Demographics
• 63 responding libraries (53% response rate)
• 16 public (25%)
• 47 private (75%)
• Average total employees: 13 librarians, 17 staff
• Average total technical services employees: 2 librarians, 4 staff

Does your library conduct assessment of technical services?
• Original responses:
  • Yes: 36 libraries (60%)
  • No: 24 libraries (40%)
• Adjusted responses based on answers to the following question:
  • Yes: 57 libraries (90%)
  • No: 6 libraries (10%)
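The slide does not spell out the adjustment rule, but presumably a library that answered “No” yet went on to report concrete assessment methods in the next question was recounted as “Yes.” A sketch of that reclassification, with invented response data:

```python
# Invented survey responses; "methods" holds answers to the follow-up
# question about specific assessment methods.
responses = [
    {"assesses": "Yes", "methods": ["Gather statistics"]},
    {"assesses": "No",  "methods": ["Gather usage data"]},  # reclassified
    {"assesses": "No",  "methods": []},                     # stays "No"
]

# A "No" respondent who reported any methods is counted as assessing.
adjusted = ["Yes" if r["assesses"] == "Yes" or r["methods"] else "No"
            for r in responses]

print(adjusted)  # ['Yes', 'Yes', 'No']
```

This kind of cross-question consistency check is a common survey-cleaning step: respondents often under-report an activity when it is named abstractly but describe it readily when asked about concrete practices.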

Specific assessment methods
• Gather statistics (84%)
• Gather usage data (49%)
• Gather input from non-technical services librarians (44%)
• Collect anecdotes or feedback from customers (30%)
• Conduct customer service surveys (25%)

Specific assessment methods, cont’d
• Benchmark with other institutions (19%)
• Anonymous suggestion box (13%)
• Conduct focus groups (10%)
• Others included:
  • ROI studies of specific technical services functions
  • Time studies
  • Baldrige Assessment Process/360 review
  • LibQUAL
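Tallying a multi-select question like this one into method counts and percentages is straightforward with `collections.Counter`; the answer lists below are invented for illustration, not the actual survey data:

```python
from collections import Counter

# Each inner list is one library's (invented) multi-select answer.
answers = [
    ["Gather statistics", "Gather usage data"],
    ["Gather statistics"],
    ["Gather statistics", "Conduct focus groups"],
    ["Gather usage data"],
]

# Flatten all selections and count how many libraries chose each method.
counts = Counter(method for answer in answers for method in answer)

# Report each method as a percentage of responding libraries.
total = len(answers)
for method, n in counts.most_common():
    print(f"{method}: {n / total:.0%}")
```

Percentages are taken over responding libraries rather than over total selections, which is why the figures on these two slides sum to well over 100%.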

Goals of technical services assessment
• Improve or streamline processes (68%)
• Improve services (63%)
• Make better decisions (62%)
• Inform strategic planning activities (55%)
• Explore offering new services (40%)
• Reallocate staff or other services (30%)
• Compare with other institutions (22%)

Goals of technical services assessment, cont’d
Other:
• Build better collections
• Identify activities or services that could be eliminated
• Establish best practices based on national standards
• Demonstrate the value of technical services to the university and the libraries
• Demonstrate the value of original cataloging to scholarship and research

Technical Services units assessed within the past five years
• Cataloging/Metadata (56%)
• Acquisitions (56%)
• Electronic resources management (45%)
• Preservation/Bindery/Physical processing (26%)

Primary responsibility for conducting assessment
• Library director/Dean/University librarian (38%)
• Division head (20%)
• Department head(s) (14%)
• Unit head(s) (7%)
• Committee (5%)
• Single librarian (4%)

Primary responsibility for conducting assessment
Other:
• A department is responsible for assessment, but assessment is also done at the division, department, and unit level
• Department and unit heads
• Director, associate director, and staff
• Library administrative team
• Director and committee

How do you report the results of technical services assessment?
• Annual report (61%)
• Informational report to library administration (52%)
• Mass email to library employees (11%)
• Library newsletter article (8%)
• Presentations (8%)
• Web site (5%)
• Campus newsletter article (2%)

How do you report the results of technical services assessment?
Other:
• Assessment report
• 5-year audit report
• Department outcome assessment reports
• Report to Provost
• Internal discussions
• Performance evaluations

Outcomes reported
• 35 responses
• Themes:
  • Streamlining processes
  • Staff reallocation
  • Changed vendors/changed vendor services
  • Collection decisions
  • Training
  • Communication
  • New services
  • Changed ILSs

Outcomes reported, cont’d
Example:
a. In the past several years, implemented several staff reorganizations, job reassessments or upgrades, and adjustments to workflow.
b. Streamlined work processes and procedures, developed training manuals, and adapted policies to achieve financial and personnel efficiencies.
c. Added several new services for faculty, such as new publications notification, new book display shelves, and an improved book order/request system.

Assessment in Practice: A Penn State Cataloging and Metadata Services Case Study

Why do assessment?
• Improve effectiveness
• Identify areas for improvement
• Communicate with customers
• Communicate with administration
• Lower costs
• Help with decision making

Assessment activities
• Workflow analysis and assessment with an outside facilitator
• 360 degree review
• Customer surveys
• Interviews or focus groups
• Internal evaluation, assessment, or reviews

Workflow analysis and assessment
• Use a facilitator
• When useful:
  • Multiple units
  • Complex workflow
  • Workflow has been in place for a long time
  • Differences of opinion exist about how to address workflow changes

How it works
• Include all stakeholders in the process
• Make an effort to understand the current process
• Identify problem areas
• Map the new process and report back to sponsors
• Follow-up assessment

Two examples
• Video processing for Media Tech
  • Cataloging, Acquisitions, Media Tech
• Government documents processing
  • Cataloging, Acquisitions, Social Sciences Library
• Results of both efforts:
  • Streamlined process with fewer hand-offs
  • Greater efficiencies
  • Faster turn-around times (acquisition to shelf)

360 degree review
• Digital Initiatives Steering Committee
• Interviews with all stakeholders and team/committee members
• What’s working? What’s not working? Suggestions for improvement? Communication?
• Record themes that emerge from interviews
  • E.g., communication issues, confusion about roles and responsibilities

Customer service survey
• Applicable to operational departments as well as to some committees, working groups, etc.
• Cataloging and Metadata Services (2011)
  • Queried subject and campus libraries
  • Not anonymous
  • One survey response per library

Survey questions
• At which branch, subject, or campus library do you work?
• What services do we provide to your unit?
• How happy are you with the following aspects of this service:
  • Speed of services
  • Quality of services
  • Speed of response to reported problems
• If you wish, describe specific service experiences in detail.
• Do you feel that you know to whom to talk about service issues as they arise? [Y/N]

Survey questions, cont’d
• How comfortable do you feel with the process of asking for help?
  • Not comfortable
  • Somewhat comfortable
  • Very comfortable
• Are you able to find information or documentation on the Cataloging and Metadata Services website? [Y/N]
• Describe your process for asking questions about cataloging services.
• If you could see one new service provided to your library by Cataloging and Metadata Services, what would it be?
• Do you have any additional comments?

Interviews or focus groups
• Can be done as part of a formal review process (e.g., 360 degree review)
• Or informally as part of periodic “checking in” with customers
• Example: biannual meetings with subject library staff
• Results: clarified policies and procedures; communicated upcoming changes in cataloging rules; answered an assortment of questions

Internal assessment
• Annual cataloging reviews
• Each cataloging team conducts its own review
  • Develops its own process
  • Writes a report
• What was the process?
• Training needs identified?
• Policy issues identified?
• Overall assessment of the process itself?

Assess the assessment
• Some of our efforts prove to be more effective than others
• Did the assessment effort give you the information you need to move forward?
• If not, you may choose another approach or refine your current approach

Questions?
Contact information:
Rebecca L. Mugridge
Head, Cataloging and Metadata Services
126 Paterno Library
University Park, PA