Valuing Consortial Resources: A Framework for Assessment
Genya O’Gara (VIVA), Madeline Kelly (GMU), Julie Kane (W&L), Beth Blanton-Kent (UVA)
(Alternate title) Squishy Data!
What is VIVA?
VIVA, the Virtual Library of Virginia, provides equitable, cooperative, and cost-effective access to library resources for the Commonwealth of Virginia’s nonprofit academic libraries serving the higher education community.
VIVA Members
Public Colleges and Universities
Public Community and Two-Year Colleges
Private Nonprofit Institutions
Educational and Research Institutions
VIVA Goals
“Level the academic playing field” for Virginia students and faculty by providing equal access to high-quality electronic resources at all member institutions.
Save institutional money and staff time by avoiding duplication of collections and individual institutional efforts.
Provide cooperative and cost-effective resources and services.
VIVA Funding
Public Institutions: Central Funding from the General Assembly, plus Member Institution Funds for Cost Shares and Opt-Ins
Private Institutions: Central Funding from the General Assembly for the Pooled Funds Program + Matching Member Institution Funds, plus Member Institution Funds for Cost Shares and Opt-Ins
VIVA Resources
Almost 50,000 journals
More than 80,000 ebooks
More than 7,500 videos
175 databases
More than 2,000,000 additional reports, proceedings, and newspapers
Project Origins
In the 2014-2016 biennium, VIVA received a 5% cut.
Because subscription prices increase annually, no new money is effectively a cut to resources.
Standardized criteria were needed to evaluate very different resources.
The VIVA Collections Committee formed the Value Metric Task Force (VMTF) to develop a consortial approach!
Goals
VMTF Charge: Design and apply a framework for the coherent and holistic evaluation of VIVA products. Determine the consortium’s highest collection development priorities and examine how these can be translated into quantifiable values. The end result will be an assessment framework and value metric system for the evaluation of shared resources that reflects VIVA’s overarching values.
Goals
VMTF considerations:
Potential factors included relevance to programs, cost avoidance/list price discount, and usage.
Usage may be further delineated into total usage, usage by institution type, ratio of usage by top institution(s), and cost per use.
Compare the value of dissimilar products to one another (apples TO oranges).
Group
Madeline Kelly - George Mason University
Cheri Duncan - James Madison University
Beth Blanton-Kent - University of Virginia
Julie Kane - Washington & Lee University
Crystal Newell - Piedmont Virginia Community College
Summer Durrant - University of Mary Washington
Genya O’Gara - Virtual Library of Virginia
Anne Osterman - Virtual Library of Virginia
Approach
Examined priorities for the consortium from an “institution type” perspective (Community Colleges, Private Colleges, Comprehensives, Doctorals).
A persona/brainstorming exercise surfaced institutional priorities.
Over 40% of brainstormed priorities were shared by all four institutional types.
Approach
Questions included:
For each product type, what data do we ALREADY collect?
For each product type, what are ways in which libraries measure value?
The group considered factors such as:
Usage/cost-per-use and annual percentage increases
Subject relevance by institution; state priorities
Impact factors, altmetrics, etc.
Cost avoidance, price caps, and protection from model changes
How to measure broad appeal
Cost per FTE
Data to determine program levels, degrees offered, FTE, etc.
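As an illustration of the usage and cost factors above, here is a minimal Python sketch of three of the calculations. All dollar amounts, use counts, FTE figures, and function names are invented for illustration; they are not VIVA data or VIVA's actual formulas.

```python
# Hypothetical sketch of three cost factors the group considered.
# All figures below are invented examples.

def cost_per_use(annual_cost, total_uses):
    """Annual subscription cost divided by recorded uses."""
    return annual_cost / total_uses

def annual_increase_pct(last_year_cost, this_year_cost):
    """Year-over-year price increase, as a percentage."""
    return (this_year_cost - last_year_cost) / last_year_cost * 100

def cost_per_fte(annual_cost, fte):
    """Annual cost divided by full-time-equivalent enrollment."""
    return annual_cost / fte

# Example: a $120,000/year database with 48,000 uses and 60,000 FTE,
# up from $114,000 the previous year.
print(cost_per_use(120_000, 48_000))                    # 2.5
print(round(annual_increase_pct(114_000, 120_000), 2))  # 5.26
print(cost_per_fte(120_000, 60_000))                    # 2.0
```

Note that a roughly 5% annual price increase against flat funding is exactly the squeeze described in the Project Origins slide.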
Approach
Of these, what is:
Measurable, attainable, and easy to implement?
Adaptable by member institutions?
Approach
Reviewed product types and the types of data ALREADY collected; completed a data inventory.
Surveyed member institutions for priorities specific to format.
Putting it all together
Prioritized metrics by format using the survey results, with alignment with curriculum and cost weighted the same for each format type.
VIVA “values” were included.
Metrics and data that could answer a particular question were mapped.
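The prioritization step above can be sketched as a simple weighted sum: each metric receives a score and a format-specific weight, and the product's overall value is the total. The metric names, weights, and 1-5 scores below are invented examples, not the actual VIVA grid definitions.

```python
# Hypothetical sketch of a weighted value grid.
# Metric names, weights, and scores are invented examples.

def grid_score(scores, weights):
    """Weighted sum of metric scores for one product."""
    return sum(scores[metric] * weight for metric, weight in weights.items())

# Alignment with curriculum and cost carry the same weight, per the slide.
weights = {
    "alignment_with_curriculum": 0.25,
    "cost": 0.25,
    "usage": 0.30,
    "metadata_quality": 0.20,
}
scores = {
    "alignment_with_curriculum": 4,
    "cost": 3,
    "usage": 5,
    "metadata_quality": 2,
}
print(round(grid_score(scores, weights), 2))  # 3.65
```

Because every product reduces to one number on a shared scale, dissimilar formats (the "apples TO oranges" problem) can be compared directly.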
Two Kinds of Grids
Evaluating current resources
Evaluating prospective resources
Two Kinds of Grids (Cont’d)
For Evaluating Current Resources
Two Kinds of Grids (Cont’d)
For Evaluating Prospective (New) Resources
Eliminated metrics:
Usage by institution/institution type
Length of consortium subscription
Modified metrics:
% of the consortium’s subject content that would come from that resource
Cost-per-use for existing subscribers, cost avoidance based on the quoted price, and annual % increase based on the current offer
Quality of metadata, based on sample records provided by the vendor
Frequency and nature of technical issues, per existing subscribers
Vendor responsiveness, per existing subscribers
Totally new metrics:
Number of current subscribers
Alignment with state collection mandates/priorities
Member Criteria: The Database Grid (Common across all formats)
Subject Distribution of VIVA Resources
Member Criteria: The Database Grid (Common across all formats)
Mapping to VA Curriculum
A list of degree areas is generated by the State Council of Higher Education for Virginia (SCHEV).
These degree areas have been mapped to LC classes.
Using the top subjects for a particular resource, you can add up what percentage of state degrees are supported by that resource.
Again, the percentage corresponds to a specific score in the grid.
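The mapping step described above can be sketched as follows. The LC-class degree shares below are invented figures, not SCHEV data; only the arithmetic (summing the degree shares for a resource's top LC classes) reflects the slide.

```python
# Hypothetical sketch of the curriculum-mapping step: degree areas are
# mapped to LC classes, and a resource's top subjects are totaled against
# the share of state degrees in those classes. Shares are invented.

degree_share_by_lc = {
    "Q": 0.18,  # Science
    "H": 0.22,  # Social Sciences
    "T": 0.12,  # Technology
    "R": 0.15,  # Medicine
    "P": 0.10,  # Language and Literature
}

def pct_degrees_supported(resource_lc_classes, degree_share):
    """Percentage of state degrees supported by the resource's top subjects."""
    return sum(degree_share.get(lc, 0) for lc in resource_lc_classes) * 100

# A resource whose top subjects fall in Science, Technology, and Medicine:
print(round(pct_degrees_supported({"Q", "T", "R"}, degree_share_by_lc)))  # 45
```

The resulting percentage would then be binned to a score in the grid, as the slide describes.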
Member Criteria: The Database Grid (Common across all formats)
Metadata Checklist
Member Criteria: The Database Grid (Common across all formats)
Platform Checklist
Member Criteria: The Database Grid (Common across all formats)
Consortial Values: The Database Grid
The E-book Grid: major differences
The E-book Grid, cont’d.
The Streaming Media Grid: major differences
The Streaming Media Grid, cont’d.
The Journals Grid
Journal Grid – New Resource
Testing and Adoption: For Evaluating Current Resources
Uses
Understanding what we have, and what we value
A standardized way to review new and existing products
Adaptable to member institutions’ individual evaluation needs
Use data strategically to inform collection development
Helps us tell the story of what we provide to the state
Next steps
Testing, testing, testing!
Sharing and adapting
Upcoming state reversion; model in place
Questions?
Credits
“Graph” by Joel McKinney - Noun Project
“Budget” by Ed Harrison, GB - Noun Project
“Teamwork” by Yazzer Perez, MX - Noun Project
“Bucket” by Anand A Nair - Noun Project
“Currency” by Creative Stall - Noun Project
“Rulers” by parkjisun, Education Outlines Collection - Noun Project
“Book” by Arthur Shlain, RU - Noun Project
“Book” by Edward Boatman, US - Noun Project
“Movie Clip” by Netanel Koso, Israel - Noun Project
“Options Database” by Arthur Shlain, RU - Noun Project
“Quality Service” by Gregor Črešnar, Groups Vol. 5 Collection - Noun Project
“Upstairs” by Bernar Novalyi - Noun Project